title: Increase in methicillin-resistant Staphylococcus aureus acquisition and change in pathogen pattern associated with outbreaks of severe acute respiratory syndrome (SARS)
date: 2004-03-15
journal: Crit Care
DOI: 10.1186/cc2687

A bougie was used during 46. This technique does not use bronchoscopic control. A bougie is passed through the tracheal tube (TT) into the trachea. The TT is withdrawn until the cuff is above the vocal cords. With the cuff fully inflated, the TT is advanced (using the bougie as a guide) until the cuff impacts on the vocal cords. A gas-

Table 1. Complications

Complication (type)                  Number
Visible cut in bougie (a)            2
TT not pulled back far enough (b)    2
Bougie in Murphy eye of TT (c)       1
Failure to advance Blue Rhino (d)    1
Bleeding (e)                         1
Late complication (f)                2

(a) In one case the Seldinger needle was also blocked by bougie material; one patient was in a hard collar. (b) In one case the Seldinger needle perforated the TT cuff and the TT prevented tracheostomy cuff sealing; in one case the Seldinger wire passed through the Murphy eye of the TT and the dilation process caused the TT to advance into the trachea, and the second passage of the dilator was done with bronchoscopic control. (c) Caused no problems. (d) The patient had a previous tracheostomy; the serial dilator technique was used. (e) Thrombocytopaenic patient; no damage seen on bronchoscopy. (f) Infected stoma at 5 days; pneumopericardium at 4 days.

Introduction Noninvasive mechanical ventilation (NIMV) supports the patient's ventilation by mechanical means without intubation. The advantages of NIMV are well known (autonomous interruption and adaptation, intervals in use, self-performed, no humidification needed, no sedation, the possibility of expectoration, coughing, talking, feeding, and so on), as are the disadvantages of IMV (during intubation: spasm, heart attack, dilatation of the stomach; during IMV: obstruction, ventilator-associated pneumonia [VAP]; take-off; after intubation: tracheostomy). Among the different types of NIMV (negative pressure [iron lung]; positive pressure [intermittent positive pressure ventilation, spontaneous intermittent mechanical ventilation, PSV, BiPAP, PAV]; CPAP), in recent years a face CPAP mask (Boussignac-Vygon) has been developed and used in a CPAP ventilation mode. An O2 supply is used to create, through a special valve mechanism, continuous positive pressure in the airways. Although this mask is very effective at correcting hypoxemia, it has two serious disadvantages: (1) it uses a very high flow of O2 to create an effective CPAP, resulting in an uncontrolled FiO2 of more than 60% (up to 95%) in the majority

Introduction Intrapulmonary Percussive Ventilation® (IPV) is a complementary chest physiotherapy technique for the treatment of restrictive and obstructive disease. We tested its tolerance and efficacy after cardiac surgery in a prospective randomised controlled study.

Patients and methods Forty patients receiving chest physiotherapy during their first 3 days after cardiac surgery were randomised into two groups: the control group received one conventional chest physiotherapy session per day, and the IPV group received the same treatment alternated with IPV1® application each day. Treatment duration was the same. Blood gases, flow-volume curve, subjective sensation of dyspnoea (using a visual scale), thorax radiograph, postoperative complications and hospital length of stay were analysed.
Results (Table 1) Both groups were comparable with regard to type of intervention, comorbidities and demographics. Blood gas analysis (days 1 and 3) and the frequency of postoperative clinical complications were not statistically different. However, fewer atelectases were seen in the IPV group on the thorax radiograph. Subjective dyspnoea sensation was lower in the patients of the IPV group. Hospital length of stay (LOS) was longer in the control group. No IPV treatment was stopped for intolerance.

Conclusion Tolerance of IPV treatment was good early after cardiac surgery. In this study, use of IPV during the first three postoperative days reduced hospital LOS after cardiac surgery. However, a larger study is necessary to confirm these results.

To measure it, we used a direct method with certain technical innovations. The subjects breathed through an oronasal mask and a one-way valve. Expired air was collected in a 5-l balloon in order to provide adequate mixing of dead-space and alveolar air. The CO2 concentration at the end of the balloon (PECO2) was measured with an RGM 'Ohmeda' capnograph (infrared technique), together with minute ventilation. The arterial oxygen partial pressure (PaO2) was measured simultaneously from arterial blood using a conventional method.

A proteomics approach to ventilator-induced lung injury might identify protein patterns that contribute to epithelial injury. To identify changes in alveolar type II cells (ATII), rats were mechanically ventilated for 5 hours with a high tidal volume (HTV; 20 ml/kg, no positive end expiratory pressure) or a low tidal volume (LTV; 6 ml/kg, positive end expiratory pressure 4 cmH2O) and compared with pooled controls without mechanical ventilation (SV). ATII were isolated and lysed. Protein expression was compared using the recently introduced cleavable isotope-coded affinity tag (ICAT) methodology. After tryptic digestion, cysteine-containing peptides were tagged with biotin, extracted using an avidin-coated column and identified by HPLC and mass spectrometry with collision-induced dissociation. Spectra were interrogated against the SwissProt database and quantified using the ProteinProspector software. HTV ventilation resulted in morphologic changes, pulmonary edema and neutrophil influx in the lung. Quantification showed differential expression of several hundred proteins following both HTV and LTV ventilation. Most proteins were upregulated by LTV ventilation compared with SV. For some proteins, upregulation was markedly attenuated after HTV compared with LTV. This was not true for ICAM-1, which correlated with the increased number of neutrophils in these samples. After HTV, surfactant proteins B and D followed different patterns. Since the differentially expressed proteins play important roles in innate immunity and in pulmonary edema fluid clearance, the attenuated response of ATII after HTV ventilation might indicate a clinically relevant impediment.

P25 Inspiratory vs expiratory limb of the pressure-volume curve for the positive end-expiratory pressure setting in acute lung injury

Results Pressure-controlled or pressure support MV modes were used most often (97.7%). No differences were seen between the four MV modes (ASV, IPPV, ASB and PC) with respect to applied Vt and PEEP. Higher levels of PEEP were used in ARDS patients (P = 0.002). Importantly, in the majority of patients Vt was > 8 ml/kg. Moreover, Vt was significantly higher in ARDS patients than in non-ARDS patients (P = 0.002).
There is a large discrepancy between the ideal Vt and the actually applied Vt. A possible explanation is the use of absolute bodyweight instead of ideal bodyweight for the calculation of Vt (for example, by the ARDSNet formula, predicted body weight is 50 + 0.91 × (height in cm − 152.4) kg for men and 45.5 + 0.91 × (height in cm − 152.4) kg for women, usually well below the actual weight). Implementation of small-Vt MV needs an educational program and evaluation, which is currently in progress in all three ICUs.

The initially applied PEEP and the determined optimal PEEP did not differ significantly. Compared with baseline (T0), the PaO2 was significantly higher following alveolar recruitment (Tep) and also after 60 min. The EVLW did not change significantly during the study period (Table 1). Regression analysis could not reveal any significant relationship between PaO2 and EVLW at the different PEEP levels. Recently we found a significant correlation between EVLW and PaO2/FiO2 and PEEP levels [1]. To date, no clinical trials have reported the relationship between oxygenation and EVLW during alveolar recruitment and PEEP optimisation. Based on the current results of this pilot study, it seems that EVLW does not directly affect oxygenation during recruitment. Completion of the study is required to evaluate the effect of EVLW on the optimal PEEP in ARDS.

1. Szakmany T, Heigl P, Molnar Zs: Anaesth Int Care 2004 (in press).

Introduction Pneumothorax is present in 48.8% of cases of acute respiratory distress syndrome (ARDS); its development becomes more likely as the duration of the process increases, and its presence affects the patient's chances of survival. Bronchopleural fistulas prolong pneumothorax in 2% of cases of ARDS, increasing mortality by 26%. No prospective controlled studies have been carried out into the management of persistent air leak (PAL) in ARDS, nor has any therapeutic option for its treatment been proved superior to another. Pleurodesis by autologous blood (PAB) is an effective, simple and inexpensive method in a select number of cases of oncological pulmonary surgery; there are also anecdotal descriptions of its use in nonsurgical patients.

Objective The goal of this study is to compare the efficacy of PAB with conventional continuous aspiration in the management of PAL in ARDS patients with pneumothorax.

Design A nonrandomized study comparing two groups matched 1:1.

Patients Two groups of 17 patients, all with ARDS, pneumothorax and PAL.

Interventions One group underwent conventional treatment while the other received PAB. The average difference between the groups was 8 days less seal time (P < 0.001), 11 days less weaning time (P < 0.001) and, on average, 9 days less time spent in the ICU (P < 0.001).

Conclusions The use of PAB, in comparison with the exclusive use of a thoracic drain with a water seal, for the treatment of PAL in ARDS is an effective, inexpensive and simple method that decreases the ventilator weaning time and makes for a shorter stay in the ICU.

Table 1. The increase between day 1 and day 7 of respiratory values of the studies

Introduction Open lung biopsy (OLB) is regarded as the gold standard for the diagnosis of pulmonary infiltrates. This procedure is associated with a significant risk of morbidity and mortality in the intensive care unit (ICU) setting. We wished to study the efficacy of open lung biopsy in ICU patients requiring advanced respiratory support associated with bilateral pulmonary infiltrates.

Methods We reviewed the medical case records of all patients who underwent open lung biopsy on the ICU between August 1996 and June 2003.
We included patients requiring advanced ventilatory support, with persistent bilateral infiltrates on chest X-ray failing to respond to first-line treatment. The biopsies were performed in the ICU at the University Hospital of Wales under general anaesthesia. Access to the lung was via an anterior mini-thoracotomy. All operative lung samples were sent for bacteriology (Gram staining and culture), virology and histology assessment. We performed 13 OLBs in the study period. The patient demographics (survivors versus nonsurvivors) are presented in Table 1. Open lung biopsy provided a diagnosis in 12 out of 13 patients (92%). In all 12 of these patients, specific treatment changes were implemented. Overall mortality was 53.8%.

Conclusion OLB in critically ill patients is an accurate diagnostic tool.

Background Acute respiratory distress syndrome (ARDS) is characterised by the development of noncardiogenic pulmonary oedema leading to refractory hypoxaemia. The early clearance of alveolar oedema is an important determinant of outcome in ARDS. β-agonists have been shown to upregulate alveolar fluid clearance (AFC). The primary aim of this study was to determine the concentration of salbutamol achieved in plasma after treatment with intravenous (IV) salbutamol. A second aim was to determine the concentration of airspace salbutamol required to stimulate AFC in the normal rat lung.

Methods Patients with acute lung injury/ARDS were randomised to an infusion of IV salbutamol (15 µg/kg/hour) or placebo. Plasma salbutamol levels were measured by a commercially available ELISA before and 24 hours after starting the infusion. In a second series of experiments, AFC was measured in rats in response to 10^-4 M to 10^-8 M salbutamol, as previously described [1].

Results Twelve patients were recruited (six received placebo). The median plasma salbutamol level was 408 ng/ml in the salbutamol-treated group (equivalent to 10^-6 M) compared with undetectable in the placebo group (P = 0.001). Basal AFC in rats was 7.6 ± 2.2%/hour. Salbutamol (10^-6 to 10^-5 M) increased AFC by 100% and 400%, respectively, above baseline (P < 0.05). The plasma concentration (10^-6 M) following IV administration in patients with ARDS may be sufficient to stimulate AFC if a similar concentration reaches the airspaces. Aerosolised delivery of a β2-agonist can achieve higher concentrations in the distal airspaces and maximal AFC. Although both IV and aerosolised delivery of a β2-agonist can achieve an effective concentration, the optimal route of delivery for treating ARDS patients remains uncertain.

Introduction The prone position has been shown to attenuate ventilator-induced lung injury in experimental models [1,2]. Recently, it has been suggested that the mechanism may be a more homogeneous distribution of strain within the lung parenchyma [3]. This study was performed to explore this hypothesis.

Methods A total of 30 animals were ventilated in the supine (n = 15) or prone (n = 15) position until a similar degree of ventilator-induced lung injury (VILI) was reached. To do so, the experiment was interrupted when respiratory system elastance (Ers) reached 150% of baseline [4]. Thereafter VILI was assessed by the lung wet-to-dry (W/D) ratio and histology (H&E stain). In five more animals, CT scans (GE Medical Systems LightSpeed QX/i; 0.6 mm thickness, 100 mA, 100 kV) were taken at end-expiration and end-inspiration (90% of inspiratory capacity) in both the supine and prone positions.
Quantitative analysis (Maluna 2.02, Mannheim, Germany) was performed on the entire lung and after dividing the lung into four zones along the vertical axis, from ventral to dorsal. We also considered the area of the lung as seen from a lateral view, taking the height at each scan and dividing it into upper and lower regions according to the maximal height of the lung at end-expiration (Osiris medical imaging software 3.6, Geneva, Switzerland). The tidal volume distribution was then calculated as the ratio between the area at end-inspiration and the area at end-expiration. This was calculated for the upper and lower regions in both the supine and prone positions. Data are shown as mean ± SD.

Results Rats were ventilated with comparable ventilator settings (tidal volume of 88.7 ± 8% of inspiratory capacity vs 88.6 ± 4.9%, supine vs prone, P = not significant [NS]). Similar VILI was reached, as assessed by Ers, W/D and histology. The time taken to reach the target VILI was longer in the prone position (73 ± 37 min vs 112 ± 42 min, supine vs prone, P < 0.05). On CT scan analysis, lung volumes were similar between groups, both at end-expiration (1.84 ± 0.25 ml vs 1.97 ± 0.16 ml, supine vs prone, P = NS) and at end-inspiration (10.07 ± 2.79 ml vs 9.52 ± 2.49 ml, supine vs prone, P = NS). However, at end-expiration, lung density along the vertical axis was similar in the prone position, while it varied significantly along this axis in the supine position (ANOVA, P < 0.05). The ratio between end-inspiratory and end-expiratory areas was greater in the upper region than in the lower region in the supine position (1.64 ± 0.12 vs 0.74 ± 0.07, upper vs lower region, P < 0.05), while in the prone position it was similar (1.21 ± 0.12 vs 1.13 ± 0.07, upper vs lower, P = NS).

Conclusions The prone position attenuates VILI, slowing its progression. Both the lung volume at end-expiration and the tidal volume are more homogeneously distributed in the prone position, suggesting a more homogeneous distribution of strain within the lung parenchyma. This may explain the protective role of the prone position against the progression of VILI.

Introduction Continuous rotation therapy (CRT) using specially designed beds is known to prevent respiratory complications in ventilated patients and to exert beneficial effects in patients suffering from respiratory failure due to acute lung injury (ALI) and/or ARDS. Little is known about the physiologic effects and potentially beneficial mechanisms of CRT. Moreover, data on the optimal setting of the turning angle and the duration of the pause in the maximal lateral steep position are rare.

Methods Twelve patients suffering from ALI and/or ARDS were investigated. Patients were included if a pulmonary artery catheter had been placed and the decision to perform CRT had been taken. Patients were placed in a CRT positioning system (RotoRest, KCI, San Antonio, TX, USA) and the study was performed after 1-3 days of turning. In a first phase, oxygenation and hemodynamics were assessed during continuous turning in the supine and both maximal steep positions at an angle of 62°. In a second phase, the patients were positioned in both maximal steep positions for half an hour. Oxygenation and hemodynamics were assessed in the supine position and every 10 min during steep positioning. All patients were sedated and mechanically ventilated using a pressure-controlled mode. Ventilation parameters were not changed throughout the study period.
Results Ten male and two female patients fulfilling the diagnostic criteria of ARDS were evaluated. The median age was 54 years (range 22-81 years). Ten patients tolerated prolonged lateral steep positioning well; in two patients the study had to be stopped prematurely in the left lateral position, due to a decrease in blood pressure in one case and a decrease in arterial oxygen saturation in the other. The results were comparable in both phases of the study. Overall, oxygenation did not change significantly with position but tended to deteriorate in the lateral steep positions compared with the supine position. Interindividual changes in oxygenation with different positions showed high variability. Tidal volumes and lung compliance declined significantly in the lateral steep position compared with the supine position. The cardiac index and arterial blood pressure did not change significantly.

Conclusions The beneficial effects of CRT are not due to recruitment of nonventilated lung areas. Steep positioning itself does not improve oxygenation. Continuous turning seems to exert its beneficial effects through other mechanisms. CRT should be performed without prolonged pauses in the maximal steep position.

Introduction In acute respiratory failure (ARF), in particular acute lung injury (ALI) and acute respiratory distress syndrome (ARDS), a change from the supine position (SP) to the prone position (PP) can improve oxygenation by recruiting alveoli situated in dorsal dependent regions and by altering the ventilation/perfusion ratio. The efficacy of this intervention is shown by the course of the oxygenation index. The aim of our study was to demonstrate the different effects of prone position ventilation (PPV) in ARF patients with different concomitant diseases. We studied 110 consecutive patients with ALI (n = 18) or ARDS (n = 92), mean (± SE) age 66 ± 13 years, in a clinical follow-up design at a surgical ICU in a university hospital, using the American-European consensus definition. Respiratory failure was accompanied, solely or in combination, by sepsis (n = 52), pneumonia (n = 66), malignant disease (n = 25), vascular disease (n = 60) or multiple trauma (n = 10) as concomitant disease. All patients were ventilated intermittently in SP and PP (135° left-side/right-side position) for at least 6 hours/day as supportive treatment of respiratory failure. Data collection included, apart from baseline characteristics, the individual oxygenation index and the concomitant diseases of the patients.

Discussion In particular, patients with ARF and sepsis showed an early and lasting increase in oxygenation after starting PPV, while those with pneumonia, vascular disease or malignant disease showed a visible increase in oxygenation on the first day after starting PPV and varying degrees of improvement in the following days. Multiple trauma patients showed only a trend towards better oxygenation, not a significant increase.

Introduction SARS is caused by a newly described coronavirus [1]. About 20% of SARS patients require oxygen supplementation and mechanical ventilation in an ICU for respiratory failure and ARDS [2]. We aim to describe the CT findings of patients in the late stage of ARDS caused by SARS, and to report changes on long-term follow-up.

Methods A retrospective review of CT findings in eight patients who met CDC criteria for SARS. All patients met criteria for ARDS [3]. CT was performed in late-stage ARDS (more than 2 weeks after onset of ARDS) [4], and after discharge from hospital in survivors.
Relevant respiratory and ventilatory parameters, total steroid dose and outcome were recorded. All mechanically ventilated patients received low-pressure (peak pressure < 30-35 cmH2O), low-volume (tidal volume < 8 ml/kg estimated lean body mass) ventilation [5]. Five patients received prolonged mechanical ventilation (over 14 days), one was ventilated for only 72 hours, and two patients were not ventilated at all. All patients received high-dose pulse methylprednisolone (2.5-7 g total dose). Three patients died and five survived to hospital discharge.

ARDS chronic-stage CT findings: consolidation was present in five patients. Ground-glass opacification and interstitial thickening were present in all patients. Three had evidence of fibrosis. Small pulmonary cysts were present in five patients and extrapulmonary gas (pneumothorax) in two. Findings in patients after long-term mechanical ventilation (more than 14 days) and after short-term or no ventilation were similar. At follow-up CT (mean 3 months, n = 4), consolidation and extrapulmonary gas had resolved and ground-glass opacification had improved, but signs of fibrosis had generally progressed.

Conclusion The CT features of late-stage ARDS caused by SARS are similar to those seen in late-stage ARDS from other causes [4], with no apparent differences between patients who received prolonged mechanical ventilation and those who did not. Fibrotic changes seen in the chronic phase of ARDS do not seem to resolve significantly after discharge.

Methods We retrospectively reviewed the case records of all probable SARS patients admitted to a designated national SARS ICU from 1 March to 13 July 2003, when the last SARS patient was discharged.

Results One hundred and ninety-nine probable SARS patients were admitted to this national SARS hospital. Forty-six (23.1%) required ICU admission. The mean age of these ICU patients was 49.8 ± 16.4 years, with equal sex distribution and a racial distribution similar to the national population. The mean APACHE II score was 19.2 ± 9.6. The median (interquartile range [IQR]) PaO2/FiO2 ratio on ICU admission was 88 (60-128). Eighty-five per cent required mechanical ventilation. Complications observed included: septicaemia 16 (34.8%), secondary pneumonia 24 (52.2%), deep vein thrombosis 11 (23.9%), acute renal failure 9 (19.6%), acute myocardial infarction 8 (17.4%), stress hyperglycaemia 8 (17.4%), pneumothorax 8 (17.4%), pulmonary embolism 7 (15.2%), cerebrovascular accidents 4 (8.7%) and pneumomediastinum 1 (2.2%). The in-ICU mortality was 54.3% (25/46). The median (IQR) length of mechanical ventilation was 12 (9-22) days. The median (IQR) lengths of stay in the hospital and ICU were 23.5 (15-36) days and 14.5 (7-22) days, respectively. Cox regression analysis showed that male sex (P = 0.03), an APACHE II score > 15 (P = 0.003) and a history of congestive cardiac failure (P = 0.017) were independent predictors of mortality among SARS ICU patients.

Conclusion About one in five probable SARS patients required ICU care. This group of critically ill SARS patients had high mortality and morbidity. The predictors of ICU mortality were male sex, an APACHE II score > 15 and a history of congestive cardiac failure.

Objective This study aimed to describe patients with SARS who developed respiratory failure requiring ICU admission.

Methods Retrospective analysis of prospectively collected demographic, clinical, biochemical, microbiological and radiological data on SARS cases admitted to an ICU.
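For orientation, the oxygenation ratio reported in the Results below (and throughout these abstracts) is the PaO2/FiO2 (P/F) ratio, with FiO2 expressed as a fraction; a worked example with illustrative numbers, not taken from the study data:

```latex
\[
\mathrm{P/F} \;=\; \frac{\mathrm{PaO_2}}{\mathrm{FiO_2}},
\qquad \text{e.g.}\quad \frac{73\ \text{mmHg}}{0.5} \;=\; 146\ \text{mmHg},
\]
```

which lies below the 200 mmHg cutoff conventionally used to define ARDS.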
Results A total of 116 patients, including 10 health care workers, were admitted to the ICU with a clinical diagnosis of SARS. In 62% of cases, SARS-CoV was positive. The mean time interval between symptom onset and hospital admission was 4.8 ± 3.7 days, and the mean time between symptom onset and ICU admission was 10.5 ± 4.7 days. The 116 patients had a mean ± SD age of 48.2 ± 14.6 (range 24-96) years. Sixty-eight percent had known contact with or exposure to another SARS person. Significant comorbidity was present in 36%. Treatment consisted of broad-spectrum antibiotics, ribavirin, corticosteroids (maintenance and pulse) and, in selected cases, Kaletra, immunoglobulin, convalescent serum and Chinese herbal medicine. The median Acute Physiology and Chronic Health Evaluation II score was 19 (range 6-49). The mean admission ratio of the partial pressure of oxygen to the fraction of inspired oxygen was 146 ± 80 (range 60-421). Seventy-six patients required mechanical ventilation, and mortality in this group was 53%. No spontaneously breathing patient died while in the ICU. Of the ventilated patients, 28% developed barotrauma. Of the nonventilated patients, 10% developed barotrauma spontaneously. Renal failure occurred in 78% of nonsurvivors and 8% of survivors. Positive blood cultures occurred in 37% of nonsurvivors and 8% of survivors. The median duration of stay in the ICU was 13 days (range < 1 to 111 days). The overall mortality was 34.5%.

Conclusion SARS carries a high mortality, especially in mechanically ventilated patients. Barotrauma, renal failure and sepsis were prevalent.

Introduction Acute lung injury (ALI) was found to be a frequent complication of acute liver failure (ALF) in a previous study [1]. The associated mortality was high (89%), and severe hypoxia is a contraindication to liver transplantation (LT) in many transplant centers. We studied the incidence and outcome of ALI in a heterogeneous group of patients with ALF. In addition, we aimed to investigate the role of extravascular lung water index (EVLWI) measurements in the diagnosis and management of ALI in liver failure.

Methods A retrospective chart review of 40 patients with ALF fulfilling poor prognostic criteria admitted to a liver ITU over an 18-month period. ALI findings on chest X-ray (CXR) around the period of listing and perioperatively were correlated with oxygenation and ventilator parameters, as well as with measurements of EVLWI and the permeability index (EVLWI/intrathoracic blood volume index, measured via transpulmonary thermodilution; PiCCO, Pulsion, Munich, Germany). The cause of ALF was paracetamol overdose in 19 patients, NANB hepatitis in eight patients and acute Budd-Chiari syndrome in five patients. The remaining eight patients suffered from acute hepatitis B virus infection, Wilson's disease, drug-induced liver failure and autoimmune hepatitis. Twenty-eight patients (70%) were transplanted, and 24 survived to hospital discharge. Of the 12 nontransplanted patients, only one survived. A diagnosis compatible with ALI was found in 23%, with no difference between transplanted and nontransplanted patients. The median PaO2/FiO2 ratio (P/F) was 167 in ALI patients versus 283 in non-lung-injury patients (Mann-Whitney U test, not significant). Only two patients (5%) were taken off the transplant list, due to a combination of refractory hypoxia and multiple organ failure. Transplanted and nontransplanted patients did not differ in terms of P/F, positive end expiratory pressure (PEEP), EVLWI or permeability index.
Poor outcome was not associated with a low P/F, CXR signs of ALI, or a high EVLWI or permeability index. Bilateral infiltrates on CXR correlated with PEEP (P = 0.004), EVLWI (P = 0.001) and the permeability index (P < 0.001), but not with P/F. Focal lung infiltrates did not correlate with the above parameters. Twelve patients without CXR findings of ALI had significantly impaired gas exchange (P/F < 200), suggesting an extrapulmonary cause of hypoxia.

Conclusion One-quarter of the patients with ALF in our series showed signs of ALI. Severe lung injury in the peri-transplant period was not associated with a high mortality. Hypoxia is nonspecific for the diagnosis of ALI and should not be used as the sole criterion for removing patients from the transplant list. The EVLWI and the permeability index are readily available bedside parameters for the diagnosis of ALI; however, they did not correlate with patient outcome.

Objectives To determine the incidence of pulmonary dysfunction in comatose patients with severe traumatic brain injury and to evaluate the effect of respiratory failure on neurological outcome.

Methods A retrospective study of 68 trauma patients, 18 females (26.47%) and 50 males (73.53%), admitted to the ICU from July 1998 to July 2003 with isolated brain injury and a Glasgow Coma Score (GCS) of 9 or less, all under sedation and mechanical ventilation on admission. Patients enrolled in the study were aged from 16 to 83 years, with a mean age of 40.28 ± 20.73 years. Pulmonary function was evaluated, at the latest by the fourth day of hospitalization, using the Lung Injury Score (LIS) with the following components: chest radiographic findings, PaO2/FiO2 and positive end expiratory pressure. A LIS of zero was defined as normal pulmonary function, a LIS of 0.1-2.5 as mild to moderate dysfunction, and a LIS of more than 2.5 as severe pulmonary dysfunction. The maximum LIS was taken into consideration for the statistical analysis. The patients' brain computerised tomography (CT) scans on admission were graded using the Marshall CT scoring system. The Glasgow Outcome Scale (GOS) was applied after a 30-day follow-up of the population, as good recovery (1), moderate disability (2), severe disability (3), vegetative state (4) and death (5), considering (1) and (2) as good outcome and (3), (4) and (5) as bad outcome. The mean value ± SD was calculated for all data. The correlation of all variables was assessed and the outcome was evaluated.

Results The GCS on admission was 6.47 ± 1.89. The maximum LIS was recorded before the fourth day of hospitalization (day 3 ± 1), with a mean value of 3.34 ± 0.86. Six patients (8.8%) had normal pulmonary function, while 59 (89.7%) had mild to moderate dysfunction and one patient (1.47%) had severe pulmonary dysfunction. Based on the GOS, 47.1% (32/68) had a good outcome, and mortality reached 25% (17/68). A moderate statistical correlation was found between the LIS and GCS (correlation coefficient r = -0.51), and also between the LIS and GOS (r = 0.52), while the GCS and GOS were inversely related (r = -0.78). Sixty-one patients (89.7%) presented a relation between the LIS and CT score (r = 0.64). The remaining nine patients (13.23%) did not show a relation between the LIS and CT score, as aspiration prior to intubation, atelectases and mechanical ventilation with overdistension of the lung aggravated pulmonary function directly and independently of the neurological status.
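The Lung Injury Score used above is presumably the Murray score (an assumption, since the abstract does not give the formula), in which each component is graded 0-4 and the final score is the mean of the components used:

```latex
\[
\mathrm{LIS} \;=\; \frac{1}{n}\sum_{i=1}^{n} s_i ,
\qquad s_i \in \{0,1,2,3,4\},
\]
```

where the s_i grade the chest radiograph, the PaO2/FiO2 ratio and the PEEP level (the components listed in the Methods). On this scale, 0 denotes no injury, 0.1-2.5 mild to moderate injury, and > 2.5 severe injury, matching the cutoffs used above.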
Conclusions Although in clinical practice we observe a strong relation between the LIS and CT scan findings in patients with severe traumatic brain injury, this was evident in 89.7% of the population studied. Pulmonary dysfunction was not an isolated complication in patients with brain injury and a poor prognosis, but rather represented the first manifestation of a systemic disease. A broader study should be performed in order to confirm these findings.

Background Smoke inhalation injury is a frequent health threat contributing to significant pulmonary derangements. The objective of this study was to determine potential interdependencies between the pulmonary shunt fraction (Qs/Qt), tissue oxygenation and cardiac performance in the acute phase of combined burn and smoke inhalation injury.

Methods Following a 20%, third-degree burn, sheep (n = 10) were subjected to cotton smoke according to an established protocol (4 × 12 breaths, < 40°C). Before (BL) and after each set of cotton smoke inhalation, the Qs/Qt, mean pulmonary arterial pressure, cardiac index (CI), left ventricular stroke work index (LVSWI), O2 delivery index (DO2I), O2 consumption index (VO2I) and O2 extraction rate (O2-ER) were assessed. At the same time points, the arterio-venous SO2 difference and the veno-arterial pCO2 gradient were determined.

Statistics One-way ANOVA with Student-Newman-Keuls post-hoc test, or linear regression when appropriate. Data are expressed as mean ± SEM, or as the Pearson correlation coefficient, respectively.

Results The burn injury led to an immediate and sustained decrease in CI (6.1 ± 0.2 vs 3.8 ± 0.6 l/min/m2), accompanied by a depression in LVSWI (75 ± 2 vs 28 ± 1 g·m/m2; P < 0.001 each). Smoke inhalation did not further aggravate these hemodynamic changes but led to a progressive increase in the Qs/Qt, up to a maximum of 85 ± 2% (P < 0.001 vs BL). Interestingly, the magnitude of the Qs/Qt was inversely correlated with both the arterio-venous SO2 difference (r = -0.90) and the veno-arterial pCO2 gradient (r = -0.57). While the O2-ER remained unchanged, a more than 50% reduction in both DO2I (724 ± 41 vs 345 ± 43 ml/min/m2) and VO2I (295 ± 16 vs 123 ± 9 ml/min/m2) occurred (each P < 0.001 vs BL).

Conclusion This study demonstrated that in the acute phase of combined burn and smoke inhalation injury (1) the deterioration in tissue oxygenation is linked with depressions in myocardial contractility and global oxygen transport, as well as an increase in the Qs/Qt, and (2) the arterio-venous SO2 difference mirrors ventilation/perfusion mismatch more appropriately than the veno-arterial pCO2 gradient.

We examined expired air condensates (EAC) of patients with various stages of ARDS, using spectrophotometry to study NO metabolites and fluorescence to study H2O2. CD95 (FAS) on blood cells was detected using monoclonal antibodies. Our investigation showed an increase in NO metabolite levels in the EAC of patients with first- and second-stage ARDS and a significant decrease in the EAC of patients with third- and fourth-stage disease. The level of H2O2 in EAC rose as ARDS progressed. All these patients had increased FASL production, but we noted a reduction in the CD95 level in patients with the final ARDS stage. These data show that alveolar epithelial injury in ALI or ARDS is in part associated with upregulation of the FAS/FASL system and activation of oxidative stress markers.
Our findings suggest that oxygen-independent pathways may mainly operate in the process of FAS-induced apoptosis.

Introduction The addition of high levels of PEEP in patients under mechanical ventilation can be part of recruitment maneuvers and alveolar protection. However, it can also have deleterious effects on right ventricular function. We studied 16 patients admitted to a trauma ICU during the period January-August 2003. All patients were under mechanical ventilation and had no significant pulmonary disease. PEEP was increased stepwise from ZEEP to 5, 10, 15, 20 and 25 cmH2O every 10 min. At each step, we measured the haemodynamic effects and alterations in oxygen transport.

Results All were trauma patients without significant pulmonary disease. The mean age was 34.3 ± 11.9 years, 87.5% were male, the mean APACHE II score was 26.4 ± 4.5, the mean SAPS II score was 44.9 ± 11.9 and the initial mean SOFA score was 5.8 ± 1.4. The addition of PEEP caused a progressive increase in the pulmonary artery pressure, the central venous pressure (CVP) and the oxygen extraction rate. At the same time, it caused a progressive decrease in the cardiac index (CI) and the right ventricular systolic work index (RVSWI). The CVP increased from 13.3 ± 5.5 to 21.8 ± 4.4, the CI decreased from 5.0 ± 1.6 to 4.1 ± 2.1 and the RVSWI decreased from 10.6 ± 6.2 to 7.0 ± 5.9. All these alterations were statistically significant (P < 0.05).

Conclusions The addition of PEEP in patients under mechanical ventilation without significant pulmonary disease can cause progressive right ventricular dysfunction. Haemodynamic care and support should be provided when PEEP is increased for recruitment maneuvers and alveolar protection, even if this is done for short periods of time.

Introduction Tidal volume limitation and incremental positive end expiratory pressure (PEEP) correlate with improved survival in acute respiratory distress syndrome (ARDS) [1], but the impact on the haemodynamic response is not well established [2,3].

Aim of study To evaluate (1) the haemodynamic and volumetric response to incremental changes of tidal volume, and (2) the haemodynamic and volumetric response to incremental changes of PEEP. Twenty acute lung injury (ALI)/ARDS patients were mechanically ventilated and connected to an integrated monitoring system (PiCCO system; Agilent) by a fiberoptic arterial catheter (PV 2014L16) and a central venous catheter. All patients were randomised to receive mechanical ventilation with an incremental tidal volume (TV) (6, 8, 10 and 12 ml/kg at ZEEP) (group A) or TV = 6 ml/kg and incremental PEEP (5, 10 and 15 cmH2O) (group B). At every change of respiratory parameters, the main haemodynamic and volumetric data were evaluated. Cardiac output (CO) was evaluated continuously online. All data are expressed as mean ± SD. The ANOVA test was used to compare changes at the different TVs and at the incremental levels of PEEP. P < 0.05 was considered statistically significant.

Table 1. Group A

Parameter        Comparison                 P value
CO (l/min/m2)    TV 6 vs TV 10-12 ml/kg     < 0.01
MAP (mmHg)       TV 6 vs TV 10-12 ml/kg     < 0.001

The detrimental hemodynamic effects of positive pressure ventilation in hypovolemic states have been well described for more than 60 years. Nevertheless, typical paramedic training programs and applicable prehospital resuscitative trauma protocols in the United States and elsewhere often call for 'hyperventilation' therapy (e.g. rates > 15/min), particularly in moribund trauma patients. Typically, paramedics are trained to do so with the anecdotal rationale that they will 'pump in more oxygen' and better 'compensate for metabolic acidosis'. Therefore, considering that many of the patients receiving assisted ventilation are those with the most severe hemorrhage, there is concern that such practices may be an under-recognized contributor to worse outcomes, and that they may even override or mask the potential positive effects of resuscitative study interventions. To begin to address this issue, a study was performed to demonstrate that very slow respiratory rates (RRs) not only preserve adequate oxygenation and acid-base status in hemorrhagic states, but also that 'normal' or higher RRs worsen hemodynamics, even in cases of mild to moderate hemorrhage.

Methods Eight pigs were ventilated with a 12 ml/kg tidal volume (intubated, no positive end expiratory pressure, no lung disease), 28% FiO2 and RR = 12/min. The pigs were then hemorrhaged to < 65 mmHg systolic arterial blood pressure (SABP). After reaching steady state, the RR was then changed sequentially every 10 min to 6, 20, 30 and 6/min, respectively.

Results With the RR at 6/min, the animals maintained pH > 7.25 and SaO2 > 99%, with increased mean SABP (65 to 84 mmHg; P < 0.05), time-averaged coronary perfusion pressure (CPP) (50 ± 2 to 60 ± 4 mmHg; P < 0.05) and cardiac output (Qt) (2.4 to 2.8 l/min; P < 0.05). With RR = 20 and RR = 30, the SABP (73 and 66 mmHg), CPP (47 ± 3 and 42 ± 4 mmHg) and Qt (2.5 and 2.4 l/min) decreased, as did PaO2 and PaCO2 (< 30 mmHg), with P < 0.05 for each comparison. When the RR was returned to 6/min, the SABP (95 mmHg), CPP (71 ± 6 mmHg) and Qt (3.0 l/min) improved significantly (P < 0.05).

Conclusions Following moderate hemorrhage, animals can maintain adequate oxygenation and ventilation with very slow RRs, while increasingly higher RRs progressively impair hemodynamics, directly resulting in diminished coronary perfusion and Qt. These effects will probably be more pronounced in severe hemorrhagic states and may even contribute to worse outcomes and under-appreciated compromise of study results. Current resuscitative protocols for trauma involving the provision of positive pressure ventilation should be re-examined.

Introduction Volumetric monitoring with the right ventricular end-diastolic volume index (RVEDVi) and the global end-diastolic volume index (GEDVi) is increasingly being suggested as a better preload indicator than the traditional intracardiac filling pressures (central venous pressure [CVP] and pulmonary artery occlusion pressure [PAOP]). Static volumetric monitoring, however, has not consistently been shown to correlate with the cardiac index (CI). This study aims (1) to evaluate the influence of the ejection fraction (EF) on RVEDVi and GEDVi, (2) to study the effect of RVEDVi and GEDVi changes (∆), corrected for the right ventricular ejection fraction (RVEF) and the global ejection fraction (GEF), respectively, on ∆CI, and (3) to identify optimal resuscitation target volumes. Complete hemodynamic profiles with the VoLEF and PiCCO catheters (Pulsion, Germany) were obtained in five mechanically ventilated medical ICU patients: age 62.6 ± 11.2 years, SAPS II 41 ± 15.4, APACHE II 18.8 ± 5.4 and SOFA 4.8 ± 2.1. In total, 125 paired measurements of CVP, PAOP, RVEDVi, RVEF, GEDVi, GEF and CI were performed. Figure 1 shows the correlation of ∆RVEDVi corrected for RVEF (∆RVEDVi-C) with ∆CI (R2 = 0.69, P < 0.0001).
Figure 2 shows a similar correlation of ∆GEDVi corrected for GEF (∆GEDVi-C) with ∆CI (R2 = 0.62, P < 0.0001). Adjustment of the volumes was achieved by exponential correction for the deviation of the EF from normal (Fig. 3). The RVEDVi-C and GEDVi-C volumetric targets corrected for EF are presented in Table 1. We did not find any correlation between (∆) filling pressures or (∆) static volumes and (∆) CI.

Conclusion Filling pressures and static volumes are unreliable preload indices. The EF-corrected target volumes better predict fluid responsiveness. We suggest using RVEDVi-C and GEDVi-C as endpoints for target resuscitation or restoration of organ perfusion. Unnecessary over-resuscitation past these values will not benefit the patient.

Table 1

Ejection fraction    10%    20%    30%    40%    50%
RVEDVi               ...

Background The pressure recording analytical method (PRAM) is a recently developed method for beat-to-beat quantification of cardiac output (CO) based on analysis of the arterial waveform. Since PRAM can be applied under various conditions of flow, we assessed its accuracy in cardiac surgery during extracorporeal circulation (ECC), using the roller pump device (RP) as the reference gold standard. In 25 patients undergoing cardiac surgery, CO values obtained by PRAM from the radial artery were compared with bolus thermodilution (TD) before and after ECC, and with the RP readings during ECC before and after aortic clamping. PRAM flow measurements during ECC were based on the analysis of each sinusoidal arterial waveform (the continuous component of flow) produced by the RP. The estimates of blood flow measured by PRAM agreed closely with TD (r2 = 0.75; P < 0.0001; bias = 0.07 ± 0.40; coefficient of variation < 2%) and with simultaneous RP readings (r2 = 0.71; P < 0.0001; bias = 0.11 ± 0.33; coefficient of variation 2.5%). During weaning from ECC, two patterns of hemodynamic adaptation following the resumption of cardiac contraction were documented by PRAM.

Background The N-terminal prohormones of the natriuretic peptides ANP and BNP (NTproANP and NTproBNP) are accepted markers of myocardial dysfunction [1] and are increasingly used in critically ill patients. While it is well recognized that plasma levels of ANP are altered by variations in thoracic blood volume and during intravenous sodium loading [2], only incomplete data are available on the course of BNP during these interventions [3]. Two groups of eight healthy subjects were randomly tilted into a 15° feet-down (FD) or a 15° head-down (HD) position. Ten volunteers were subjected, in crossover fashion, to an infusion of 15 ml/kg NaCl 0.9% (over 60 min) or control during an observation period of 10 hours. Blood was sampled at timed intervals. NTproANP and NTproBNP were determined by radiochemiluminescence and electrochemiluminescence immunoassays, respectively.

Results NTproANP levels (as % of baseline) were higher (P < 0.05) during HD (124 ± 13%) than during FD (82 ± 6%), while NTproBNP levels were not affected by tilting. Following sodium loading (Fig. 1), plasma NTproANP levels increased immediately and returned to baseline after 8 hours. In contrast, NTproBNP levels increased 3 hours after the infusion and had doubled by the end of the observation period.

Conclusions Besides the relevant physiological implications (a sequential increase of NTproANP and NTproBNP after sodium loading has not previously been shown in humans), our data show that NTproBNP levels are influenced by sodium infusion.
This may be relevant for the interpretation of NTproBNP levels in critically ill patients.

Figure 1. The course of plasma levels of NTproANP and NTproBNP in 10 healthy volunteers during the control protocol (open circles) and the sodium loading protocol (open squares). Data are mean ± SEM. *P < 0.05, between-group difference. §P < 0.05, versus baseline value. Friedman's test and Wilcoxon's matched pairs test as appropriate.

Studies in surgical patients [1,2] have suggested that using noninvasive measurements of cardiac output to guide intraoperative fluid resuscitation results in improved patient outcome. We tested this hypothesis in patients undergoing major elective surgery.

Methods Fifty patients scheduled for major urological surgery were sequentially randomized to a control (C) group or a flow-guided (FG) group. In the C group, intraoperative fluids were given according to blood pressure, heart rate and blood loss. In the FG group, in addition to the above, volume was given to maximize the cardiac output or until a cardiac index of > 3.0 l/min/m2 was achieved. We used a NICO (Novametrix) noninvasive cardiac output monitor to record continuous measurements of the cardiac index. Patients were followed postoperatively by the investigator team, who recorded the return of bowel function and the time to hospital discharge. The decision to discharge patients was made by the surgical team per protocol.

Results Patient demographics are presented in Table 1. Patients in the FG group were older than those in the C group. Patients in the FG group were given significantly more volume per hour of surgery than those in the C group. The time to passage of flatus, the time to tolerating a soft diet and the time to hospital discharge were all decreased in the FG group (Table 2). There was no difference in morbidity or mortality between the two groups.

Conclusion This study suggests that the use of continuous noninvasive measurements of cardiac output to guide intraoperative volume replacement results in more volume being given, but in a faster postoperative recovery of bowel function and faster hospital discharge.

Introduction Central venous oxygen saturation (ScvO2) reflects both tissue oxygen delivery and consumption, and has been shown experimentally to closely reflect a range of circulatory disturbances. This and related parameters have been used successfully to improve outcome in various patient groups. Despite this, there are few or no observational data linking derangements of these parameters to outcome in high-risk surgical patients.

Methods Data were collected on high-risk surgical patients for the first eight postoperative hours. Routine monitoring included cardiac output and central venous saturation. Patients were divided into high and low ScvO2 groups according to whether the ScvO2 fell below 65% for two consecutive hourly readings, and were followed up for 28 days.

Results Seventy-nine patients were enrolled. There were eight deaths (10.1%) and 93 morbidity episodes (1.2 episodes per patient). The median age was 67 years (31-86 years). The median APACHE II score was 9 (2-23). Trends in ScvO2 and cardiac index indicated circulatory failure when many routine parameters were normal (e.g. blood pressure, heart rate and serum lactate). A higher incidence of complications (in particular septic and cardiac complications), a longer hospital stay and a trend towards increased mortality were found in patients with a low ScvO2 (see Table 1). A low ScvO2 appears to identify patients at high risk of complications after surgery.
ScvO2 may indicate hypovolaemia at an earlier stage than traditional parameters. ScvO2 may be a suitable haemodynamic goal in surgical patients, as has been shown in severe sepsis.

Objective This pilot investigation describes two EGDT implementation models: (1) the medical intensive care unit (MICU)-initiated model; and (2) the emergency department (ED)-initiated model. This is a prospective, descriptive study reporting a qualitative assessment of the advantages of, and relative barriers to, two models of EGDT implementation. The models were developed by two tertiary care hospitals. Patients eligible for the EGDT protocol were those meeting suspected infection criteria with either septic shock (systolic blood pressure < 90 mmHg despite 30 ml/kg fluid resuscitation) or an elevated lactate (> 4 mmol/l). Implementation at each center included a 1-month initial phase comprising MICU and ED physician and nursing staff education. The MICU-driven protocol conducted a 2-month baseline data collection period.

Results Both models are collaborative but vary in terms of initiation of the protocol. The MICU-driven model developed and introduced a protocol with ensuing support from the ED. The ED-driven model developed and introduced a protocol with ensuing support from the MICU. Timing of patient identification, protocol compliance and ED transfer time to the MICU are continuously evaluated in both models. Insertion of an ScvO2 central venous oximetry catheter (PreSep) is shared between the MICU and ED teams based on ED acuity, operator skill and physiologic urgency. The MICU-driven protocol is strongly supported by the intensive care staff; early transfer from the ED to the MICU is a priority. During the course of this study, the timing of patient identification and MICU/ED collaboration continued to improve. The ED-driven protocol team has found that early identification and initiation of the protocol, in combination with strong support from the ICU, results in collaborative streamlining based on ED and MICU workloads. Both institutions have found that challenges in patient identification and line insertion present the most time-consuming obstacles to protocol initiation.

Conclusion Implementation of EGDT protocols is possible through a collaborative process, whether driven by intensive care unit or emergency department teams. Frequent education and collaborative communication appear to be key elements for success.

Introduction The peripheral perfusion index (PFI), derived from the pulse oximetry signal, permits a quantitative analysis of peripheral perfusion in patients. However, the relationship between variations in this index and variations in peripheral blood flow in critically ill patients has not been studied. We therefore studied the ability of the PFI to detect the vascular response produced by induced reactive hyperemia in critically ill patients.

Materials and methods The PFI was obtained using a conventional pulse oximeter. The peripheral temperature was measured using a proximal temperature probe (Philips 21078A) placed on the forearm and a distal probe placed on the middle fingertip of the same arm used for the PFI measurements. The skin-surface temperature gradient between the proximal and distal probes (dT) was calculated as a measure of changes in forearm blood flow. Reactive hyperemia was produced by arrest of forearm blood flow with a sphygmomanometer pneumatic cuff placed around the upper arm not used for arterial cannulation.
The cuff was inflated to a pressure approximately 30 mmHg greater than systolic pressure for 3 min, after which the cuff was rapidly deflated to 0 mmHg. The hyperemic response was analysed by measuring the maximum increase in PFI relative to the baseline PFI. A Student's t test was used to assess the differences between the baseline and maximum PFI. P < 0.05 was considered statistically significant.

Results Fifty-six measurements were carried out in 14 patients. In all patients, the PFI increased after release of the pneumatic cuff. The changes in hemodynamics, PFI and dT are presented in Table 1.

Conclusions The PFI can be used to assess vascular reactivity in critically ill patients.

The primary goal in the treatment of patients with shock is to restore tissue perfusion and oxygenation. Clinical endpoints of resuscitation (i.e. blood pressure, heart rate, urine output and oxygen saturation) give an incomplete or even misleading picture. Tissue oxygen tension (ptO2), reflecting tissue oxygenation, may be a useful endpoint; however, in the clinical setting both subcutaneous (ptO2sc) and intramuscular (ptO2im) ptO2 measurements are used, although a comparative study has never been performed.

Methods In 18 critically ill patients, ptO2sc and ptO2im were simultaneously and continuously measured using polarographic Clark-type electrodes (LICOX Catheter Measurement System, GMS), placed subcutaneously and in the m. biceps brachii of the upper arm. Eighteen men and two women with septic shock (n = 8), severe sepsis (n = 5), sepsis (n = 1), polytrauma (n = 3) and nonseptic acute respiratory distress syndrome (ARDS) (n = 1) were included. The median age was 60 years (range 22-80 years), and the median APACHE score on admission was 18 (range 10-31). The median duration of tissue oxygen measurements was 5 days (range 2-8 days).

Results Median ptO2 values of the different groups are presented in Table 1. Wilcoxon's signed rank test was used to compare changes within groups, and a one-way analysis of variance (ANOVA) was used to compare the variation of ptO2 values between groups. Although differences between ptO2sc and ptO2im were found in individual patients and between groups, a clear pattern could not be established.

Conclusion Measurement of tissue oxygenation can be performed in subcutaneous tissue as well as in muscle tissue.

We compared the efficacy of transcutaneous CO2 monitoring (tcCO2) with traditional PaCO2 from arterial blood gas samples.

Method We studied consecutive ICU admissions during an 8-day period. We included all new patients aged 18 and over who were expected to survive for 24 hours. A TOSCA transcutaneous monitor (tcCO2) was attached to the earlobe. The probe was moved every 4 hours, and 4-hourly routine arterial blood gas samples were taken and PaCO2 values recorded.

Results We studied eight patients (four males and four females). The mean age was 56 years. The mean admission APACHE II and SAPS scores were 19.5 and 31.3, respectively. The mean hospital stay was 31 days. ICU survival was 75%. Two patients were ventilated using a high-frequency oscillator. A total of 64 comparisons were made. We detected a significant difference between blood gas PaCO2 and TOSCA tcCO2 of 0.43 kPa (paired t test, P = 0.0012). Figure 1 shows the bias and limits of agreement. A regression line was fitted, and the slope and intercept were highly significant. It is possible to predict PaCO2 from tcCO2 using the following regression equation: PaCO2 = 2.7 + 0.5 × tcCO2.
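Bias, limits of agreement and regression-based prediction of this kind can be reproduced from paired samples; a minimal sketch in Python, using made-up paired values rather than the study's data:

```python
import numpy as np

# Hypothetical paired measurements in kPa (for illustration only;
# these are not the study's data).
paco2 = np.array([5.1, 6.0, 5.4, 7.2, 4.9])  # arterial blood gas
tcco2 = np.array([4.6, 5.3, 5.0, 6.9, 4.4])  # TOSCA transcutaneous

# Bland-Altman statistics: mean difference (bias) and 95% limits of agreement.
diff = paco2 - tcco2
bias = diff.mean()
sd = diff.std(ddof=1)
limits = (bias - 1.96 * sd, bias + 1.96 * sd)

# Prediction using the regression reported above: PaCO2 = 2.7 + 0.5 * tcCO2.
predicted_paco2 = 2.7 + 0.5 * tcco2

print(f"bias = {bias:.2f} kPa, limits of agreement = {limits}")
print("predicted PaCO2 (kPa):", predicted_paco2)
```

With the reported coefficients, a tcCO2 reading of 6.0 kPa would predict a PaCO2 of 2.7 + 0.5 × 6.0 = 5.7 kPa.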
Conclusion Although there is a difference between PaCO2 and tcCO2, it is possible to predict the PaCO2 level from tcCO2 with moderate precision. tcCO2 may be an adequate alternative to arterial blood gas sampling.

Figure 1. Bland-Altman plot of the TOSCA data.

Objective To demonstrate the relation of the plasma N-terminal pro-brain natriuretic peptide (NT-proBNP) concentration to the severity of heart failure, the underlying heart disease and the New York Heart Association (NYHA) functional classification in patients with cardiovascular disease.

Method Plasma NT-proBNP concentrations of patients with cardiovascular disease (n = 65) and of normal controls (n = 16) were measured by a nonextracted, enzyme-linked sandwich immunoassay (Elecsys®) after admission. Structural and functional parameters of the heart were measured with echocardiography in the 65 patients at the same time.

Results The plasma NT-proBNP concentrations of patients were significantly higher than those of normal controls. They also increased according to the severity of heart failure classified by the NYHA functional classification (P < 0.001). The plasma NT-proBNP concentrations and left ventricular end diastolic

Introduction Recently, less invasive cardiovascular monitoring by transpulmonary thermodilution with the PiCCO system, using a central venous line and an arterial thermodilution catheter, has become increasingly popular. In the last issue of the Yearbook of Intensive Care and Emergency Medicine, the opinion was stated that a single cold saline injection is sufficient to adequately measure cardiac output and the derived thermodilution parameters [1]. Since there are no reported investigations in the literature on the subject, we wanted to examine this hypothesis.

Methods We retrospectively examined the data of the PiCCO system (Pulsion, Munich, Germany) from 18 patients treated in our neurosurgical intensive care unit. The neurosurgical diagnosis was mainly severe subarachnoid hemorrhage; 10 patients were additionally diagnosed with systemic inflammatory response syndrome, acute respiratory distress syndrome or cardiac failure. Thermodilution measurements consisted of up to five single injections with a bolus of 20 cm3 of iced saline. Data were automatically stored on a laptop connected to the PiCCO system. From a raw data volume of 200 MB, the thermodilution measurements were extracted and analysed for repeatability. An analysis of variance (ANOVA) was performed to quantify the disagreement between single measurements.

Results A total of 417 thermodilution procedures consisting of 1465 single bolus injections were analyzed for the indexed cardiac output (CI), intrathoracic blood volume (ITBI) and extravascular lung water (ELWI). The median difference between the lowest and highest value in a single series was 0.3 l/min/m2 (CI), 80 ml/m2 (ITBI) and 1 ml/kg (ELWI), respectively. Calculated from the within-subject variance of the ANOVA, the 95% repeatability coefficient for two measurements of the same thermodilution sequence was 0.72 l/min/m2 (CI), 270 ml/m2 (ITBI) and 3.5 ml/kg (ELWI). This translates to 48% (CI), 180% (ITBI) and 87% (ELWI) of the measured parameters' normal ranges.

Conclusion According to our data, we advise against the use of a single injection for transpulmonary thermodilution measurements. Especially for the preload parameters, the mean of at least three repeated measurements reflects the patient's status more appropriately. It remains a clinical judgement how much imprecision of measured data a clinician is willing to accept.
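The 95% repeatability coefficient quoted above is presumably the standard Bland-Altman quantity derived from the ANOVA within-subject variance (an assumption; the abstract does not give the formula):

```latex
\[
\mathrm{RC} \;=\; 1.96\,\sqrt{2\sigma_w^{2}} \;\approx\; 2.77\,\sigma_w ,
\]
```

where sigma_w is the within-subject standard deviation; the absolute difference between two repeated measurements is then expected to stay below RC for 95% of measurement pairs.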
This may vary in different situations and is, as a matter of principle, not solvable by statistics [2].

Introduction Cardiac output can be measured continuously by an invasive pulmonary artery catheter (PAC) or by noninvasive impedance cardiography. The purpose of this study is to determine how well the two methods correlate in ventilated patients. In the 18 patients with a cardiac output less than 9 l/min, the correlation coefficient was r = 0.89 and r2 = 0.80. The cardiac output measured by impedance cardiography correlates well with the PAC in patients with a cardiac output less than 9 l/min. With time, the correlation between the two methods decreases. Further studies are indicated to determine which method is best suited for ventilated patients.

... intestinal perfusion and oxygenation at PEEP levels above 10 cmH2O. Therefore we explored the effects of PEEP in an animal model of ITBV-guided volume loading. Methods Twenty anesthetized and ventilated pigs were studied. An ultrasonic flow probe was placed around the superior mesenteric artery, and catheters were inserted into the femoral artery and mesenteric vein. Animals were randomly assigned to: group 1 = controls (n = 9), which received crystalloids; and group 2 = ITBV (n = 11), which received crystalloids and continuous colloid substitution to maintain ITBV at the baseline level.

We analyzed the intrathoracic blood volume index (ITBVI) and the pulmonary artery occlusion pressure (PAOP) with respect to the stroke volume index (SVI) as the preload index. The extravascular lung water index with respect to the PaO2/FiO2 ratio was also analyzed. We prospectively studied 10 patients with septic shock monitored with a pulmonary artery catheter and the COLD system (COLD-Z021, Pulsion Medical Systems, Munich, Germany). Measurements were performed every 8 hours from study admission until 48 hours. The relationships between variables were analyzed by linear regression. Linear regression between ITBVI and SVI gave r2 = 0.30 (P < 0.0001), while the PAOP failed to correlate (r2 = 0.03) (Fig. 1). The EVLWI showed a significant inverse correlation with PaO2/FiO2 (r2 = 0.23, P < 0.0001) (Fig. 2). Our results showed that ITBVI is a more reliable indicator than PAOP for preload assessment in septic shock patients. EVLWI was confirmed to be a very interesting bedside lung edema index in this selected population.

Method A retrospective study on a surgical intensive care unit. Forty-six patients were analysed. All patients were mechanically ventilated. In all patients a PiCCO (Pulse Contour Cardiac Output) catheter was inserted. Variables such as length of ICU admission, length of pressure-controlled ventilation and ventilation settings, APACHE II score (acute physiology and chronic health evaluation) and EVLW were collected. The correlation between variables was evaluated with a linear regression model. The median APACHE II score of the included patients was 13, ranging from 7 to 33. All patients were mechanically ventilated in a pressure control mode with a positive end expiratory pressure ranging from 5 to 15 cmH2O and an inspiration pressure ranging from 15 to 40 cmH2O. We divided patients into two groups: one included patients with EVLW < 10 ml/kg and one included patients with EVLW of 10 ml/kg or more. In the group with EVLW ≥ 10, we saw a higher mortality rate compared with the group with EVLW < 10 (35.5% vs 23.8%). In both groups we found no correlation of length of ICU admission, length of mechanical ventilation or the APACHE II score with EVLW.
Although we found an association between EVLW and mortality in surgical critically ill patients, we could not confirm a relation between EVLW and length of ICU admission or length of mechanical ventilation in the patient population we studied.

Introduction Use of hemodynamic monitoring allows one to evaluate and follow up patients with acute myocardial infarction (AMI), monitor the effect of treatment, and compare different treatment options. On the other hand this creates additional problems, such as how to choose a method for monitoring that is cheap, reliable and easy for the staff to use. Aim To evaluate the possibility of continuous hemodynamic monitoring application for patients with AMI and to compare the results of two noninvasive methods, impedance cardiography (ICG) and transthoracic echocardiography (TTE), in patients with AMI. Design A prospective study. Setting Kaunas University of Medicine, Clinic of Cardiology. Patients Patients with AMI, admitted within 12 hours after the onset of disease. Methods A standard eight-electrode ICG was recorded. The average value of the stroke volume (SV) derived from the last 10 min of the ICG record (60 instantaneous SV values) was used for the comparison of ICG and TTE results. SVTTE was calculated as the difference between left ventricular end-diastolic and end-systolic volumes. Four-chamber and two-chamber views were used for SVTTE analysis. SVTTE was alternatively evaluated by measuring the flow velocity-time integral in the left ventricular outflow tract. Results Eighty-seven patients were investigated according to the study protocol. The results of 74 patients were used for comparative analysis: 56 (75.7%) men and 18 (24.3%) women. The average age was 64.2 ± 14.9 years, body mass index (BMI) was 28.6 ± 3.9, and ejection fraction was 42.8 ± 10.6%. Comparing the values of SV derived from ICG and TTE, the calculated correlation coefficient (r) was 0.73. The correlation between the methods of ICG and TTE reached r = 0.79 in men, but only r = 0.32 in women. Weak correlation between the methods was observed when SV values were compared for patients with BMI > 29 or < 19 (r = 0.42), as well as when SV was measured by evaluating the flow velocity-time integral in the left ventricular outflow tract (r = 0.37). Conclusions Significant correlation of SV was observed between ICG and TTE. Noninvasive ICG monitoring during AMI can be considered sufficiently reliable for further application.

The benefit of routine measurement of cardiac output after cardiac surgery is still debated. Some studies found no benefit of routine right heart catheterisation [1] while others found a reduced length of stay [2]. Clinical prediction of cardiac output is poor after cardiac surgery [3]. Clinicians also do not really know what value of cardiac output is necessary after cardiac surgery. The aim of the present study was to verify whether the commonly accepted lower value of cardiac index of 2.2 l/min/m2 [4] at arrival in the ICU was a good predictor of complications after cardiac surgery. Methods Seventy-three consecutive patients with a cardiac index lower than or equal to 2.2 were included in a prospective observational study. Right heart catheterisation was decided in the operating room if the patient had complex surgery or left ventricular dysfunction. Routine hemodynamic measurements were performed at arrival in the ICU and 2, 6 and 18 hours later, and included arterial and mixed venous blood gases.
Postoperative complications were defined as death, renal insufficiency, and need for prolonged mechanical ventilation. Results Fifty-one patients had no complication (Group 1), and 22 patients (Group 2) presented a postoperative complication (including four deaths). No difference was found between the two groups for preoperative or intraoperative data (age, ejection fraction, EuroSCORE, length of bypass or aortic clamp) or for the blood lactate level, pH or base excess. At arrival in the ICU and 18 hours later, hemodynamic parameters were similar between the two groups. Complicated patients had lower cardiac output and SVO2 values 2 and 6 hours after arrival in the ICU. In the logistic regression analysis, a reduced cardiac index at 2 and 6 hours after admission to the ICU had the strongest independent predictive value for postoperative complications. Discussion A low cardiac output 2 or 6 hours after arrival in the ICU is associated with a high rate of postoperative complications. Right heart catheterisation allows early prediction and treatment of low cardiac output in order to prevent postoperative complications [2].

During microcirculatory failure in septic shock, the relationship between modifications in the macro-circulation and micro-circulation may depend on the resuscitation. How is the microcirculation modified during a cardiac output (CO) increase by fluid? Four septic shock patients (73 ± 12 years old) were investigated with orthogonal polarization spectral imaging (OPS) and with a Swan-Ganz catheter. The SAPS score was 60 ± 8, and the norepinephrine dose was 0.6 ± 0.3 µg/kg/min. In three sublingual areas per patient, the vessels (small, medium, large) were scored as no flow = 0, sludge = 1, moderate = 2, normal = 3, allowing one to compute a microcirculatory ratio for each vessel category. Systemic hemodynamic and OPS data were taken before and within 30 min after a fluid bolus (5-7 ml/kg). OPS data are presented as median (interquartile range); demographic and hemodynamic data are presented as mean ± SE. Only the number of small vessels/field tended to increase after fluid, from 34 (15) to 55 (21) (P = 0.09). As shown in Fig. 1, all vessel categories increased their ratio (P < 0.01). The blood pressure did not increase after the fluid challenge, whereas the CO increased by 39 ± 18% (P < 0.05; Fig. 2). All sizes of microvessels changed with fluid loading, with different patterns between small and larger vessels, going from recruitment to increased blood velocity.

A study was undertaken to test whether endotoxin administration to human volunteers can be used as a model to study the sepsis-induced increase in microvascular permeability. In healthy, nonsmoking volunteers, microvascular permeability was assessed before and 5 hours after the administration of endotoxin (2 ng/kg body weight, n = 8) or placebo (n = 8), by (1) the transcapillary escape rate of 125I-albumin (TER-alb); (2) venous occlusion strain-gauge plethysmography to determine the filtration capacity (Kf); and (3) bioelectrical impedance analysis (BIA) to determine extracellular water (ECW) and total body water (TBW).
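Of the three techniques listed above, the TER-alb calculation is the most formula-like: the escape rate is conventionally taken from the log-linear decline of intravascular tracer activity over the first hour. A minimal sketch follows, with hypothetical counts; the convention, not the authors' exact processing, is what is assumed here.

```python
import numpy as np

# Hypothetical plasma activity of 125I-albumin, expressed as % of the t=0 value.
t_min = np.array([5, 10, 20, 30, 45, 60])                   # sampling times (min)
counts = np.array([100.0, 99.2, 97.9, 96.7, 94.9, 93.1])    # % of initial activity

# Fit ln(activity) = ln(A0) - k*t, then express the escape rate per hour.
k_per_min = -np.polyfit(t_min, np.log(counts), 1)[0]
ter_alb = 100 * (1 - np.exp(-k_per_min * 60))               # % escaping per hour

print(f"TER-alb ~ {ter_alb:.1f} %/h")   # ~7 %/h, the same order as the values reported
```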
Administration of endotoxin resulted in the expected increase of pro-inflammatory cytokines (TNF-α from < 0.015 to 856 ± 158 pg/ml, P = 0.002), accompanied by fever (maximum temperature 38.7 ± 0.3°C, P < 0.0001), flu-like symptoms and cardiovascular changes (heart rate from 63 ± 3 bpm at baseline to 91 ± 3 bpm at t = 5 hours, P < 0.0001; mean arterial pressure from 96 ± 3 mmHg to 79 ± 4 mmHg, P < 0.0001; and forearm blood flow from 3.7 ± 0.8 ml/min/dl to 6.8 ± 1.1 ml/min/dl forearm volume). All changes were significantly different from the control group. In the endotoxin-treated subjects all microvascular permeability parameters remained unchanged: TER-alb from 7.2 ± 0.6% to 7.7 ± 0.9% (P = NS); Kf from 5.0 ± 0.4 to 4.2 ± 0.4 (P = NS); and ECW/TBW measured by BIA from 0.42 ± 0.01 to 0.40 ± 0.01 (P = NS). Likewise, no significant changes appeared in the microvascular permeability parameters in the control group. Although endotoxin is frequently used as a model to study sepsis-associated effects, an endotoxin-induced increase in microvascular permeability in vivo could not be detected by three different methods. Endotoxin administration to human volunteers is therefore not suitable as a model to study changes in microvascular permeability.

Experimental studies using a peritonitis model of sepsis in rats have reported abnormalities in microvascular perfusion including increased stopped-flow capillaries, increased fast-flow capillaries and impaired regulation of blood flow. We have used a spectrophotometric functional microvascular imaging system to quantify capillary geometry, red blood cell flow and red blood cell hemoglobin oxygen saturation during sepsis. We found that while oxygen saturation at the entrance of the capillary bed was unaffected in sepsis, there was a significant drop in saturation at the venous end of capillaries with normal velocities. Although the fall in oxygen saturation correlated with the number of stopped-flow capillaries, we could not determine from these data whether tissue oxygen consumption changed with sepsis, whether fast-flow capillaries acted as functional oxygen shunts, or whether any regions of the tissue were anoxic. To address these questions we constructed a computational model that simulated O2 transport in a three-dimensional volume of tissue supplied by heterogeneously spaced capillaries with fast, normal and stopped flow. The model was based on our experimental oxygen transport data (capillary density, number of stopped-flow capillaries, capillary red blood cell velocity and supply rate, and entrance oxygen saturation levels), and it predicted oxygen consumption, tissue oxygen levels and oxygen transport in fast-flow capillaries; data that could not be measured experimentally with current technology. Tissue oxygen consumption in the model was adjusted to yield the same venous-end oxygen saturation values as measured experimentally. Two cases were modeled: average sepsis (AS) with 33% stopped, 33% normal and 33% fast flow, and extreme sepsis (ES) with 50% stopped, 25% normal and 25% fast flow. Simulations found approximately twofold and fourfold increases in tissue oxygen consumption for AS and ES, respectively. Average (minimum) tissue PO2 decreased from 43 (40) mmHg in control to 34 (27) and 26 (15) mmHg in AS and ES, respectively. Clustering fast-flow capillaries to increase flow heterogeneity resulted in only a slight decrease in minimum tissue PO2, to 14.5 mmHg.
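The mass balance underlying such a simulation can be illustrated with a deliberately reduced sketch: one capillary per tissue module, with the demand of stopped modules redistributed to the flowing neighbours. This is not the authors' 3D finite-difference model; every parameter value below is an assumption chosen only to show why fast-flow capillaries exit with higher saturations.

```python
import numpy as np

S_IN = 0.95   # entrance O2 saturation (unchanged in sepsis, per the data above)
C_HB = 0.2    # ml O2 carried per ml blood at full saturation (assumed)

def outlet_saturations(shares, rel_flows, demand_per_cap):
    """shares: fraction of capillaries that are stopped/normal/fast.
    rel_flows: relative blood flow per capillary in each class (0 = stopped).
    demand_per_cap: tissue O2 demand per capillary module (assumed units)."""
    shares = np.array(shares, dtype=float)
    rel_flows = np.array(rel_flows, dtype=float)
    flowing = rel_flows > 0
    # Demand of stopped modules is shared evenly among flowing capillaries.
    load = demand_per_cap * (1 + shares[~flowing].sum() / shares[flowing].sum())
    # Saturation drop along a capillary = delivered O2 / (flow * capacity).
    s_out = np.full(rel_flows.shape, np.nan)
    s_out[flowing] = S_IN - load / (rel_flows[flowing] * C_HB)
    return s_out

# 'Average sepsis': 33% stopped / 33% normal / 33% fast (fast = 3x normal flow).
# Output: normal capillaries exit at ~0.80, fast ones at ~0.90, i.e. shunt-like.
print(outlet_saturations([0.33, 0.33, 0.33], [0.0, 1.0, 3.0], 0.02))
```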
Although fast-flow capillaries did appear to be oxygen shunts with higher venous oxygen saturations, they were a significant factor in preventing tissue anoxia during sepsis. Despite the contribution of fast-flow capillaries, simulated tissue PO2 values continued to fall as the degree of microvascular injury increased. The model predicts that the microvascular oxygen transport abnormalities associated with sepsis expose the tissue to local regions of hypoxia that may ultimately lead to significant changes in cellular function.

Introduction According to the modified Fick equation, the venoarterial PCO2 gradient (∆PCO2) is inversely correlated with cardiac output, so that an increase in the ∆PCO2 value can alert to the presence of low-flow states, or at least to a mismatch between tissue blood flow and metabolism. In septic shock patients, however, the relation between cardiac output and ∆PCO2 is quite complex due to possible distributive abnormalities of macrocirculatory and microcirculatory blood flow. The objective of this study was to determine whether there is a good correlation between cardiac output and ∆PCO2 in a group of septic shock patients, and between ∆PCO2, base excess (BE) and the arterial lactate concentration (LAC), the latter two having already been demonstrated to be good predictors of mortality in critically ill patients. We retrieved 21 patients from our prospectively collected database from January to December 2000. APACHE II scores were calculated at admission, and ∆PCO2, cardiac output, BE and LAC were recorded at the moment of pulmonary artery catheter insertion and 24 hours after. All four variables had a normal distribution in our sample. Correlation between ∆PCO2 and the other three variables was assessed using the Pearson coefficient. The patients' age was 53.1 ± 13.9 years and the APACHE II score was 26.0 ± 7.4. ∆PCO2 was not correlated with any of the other three variables. The Pearson correlation coefficient was -0.173 (P = 0.29), 0.291 (P = 0.07) and 0.08 (P = 0.66) for cardiac output, BE and LAC, respectively.

Background and goal of study Monitoring of organ function is often crucial for guiding therapy in critically ill patients. Most recently, the indocyanine green plasma disappearance rate (ICG-PDR) has been suggested for assessment of liver function, and a transcutaneous system has been clinically introduced and validated [1]. In this study, we analyzed the agreement between ICG-PDR measured with the recommended dosage (0.5 mg/kg) and a reduced dosage (0.25 mg/kg). We studied 16 critically ill patients (five female, 11 male) who underwent monitoring of ICG-PDR for a clinical indication (LiMON, Pulsion Medical Systems, Germany). For each comparative measurement, in a random fashion, either 0.5 mg/kg or 0.25 mg/kg ICG was injected, followed by the corresponding dosage 60 min later. We analyzed 31 pairs of ICG-PDR measurements applying the recommended dosage (0.5 mg/kg, ICG-PDR0.5) and a reduced dosage (0.25 mg/kg, ICG-PDR0.25). Respirator settings and dosages of vasoactive drugs remained unchanged during the study. No drugs that may influence hepatic blood flow were administered during the study period. There were no changes in fluid status, and the central venous pressure was unchanged at the two time points. Results and discussion ICG-PDR0.25 was between 2.7 and 25.0%/min and ICG-PDR0.5 between 4.5 and 24.5%/min, respectively.
Linear regression analysis revealed ICG-PDR0.25 = 1.13 × ICG-PDR0.5 - 0.66%/min (r = 0.95, P < 0.0001), with a mean bias of 1.0%/min (standard deviation 2.5%/min). The 15-min residual rates were also highly correlated (r = 0.92, P < 0.0001), with a mean bias of 0.3%. Conclusion A reduced dosage of ICG (0.25 mg/kg) is sufficiently accurate for transcutaneous measurement of ICG-PDR in critically ill patients.

Introduction Indocyanine green (ICG) clearance can be measured with the LiMON® device (Pulsion, Germany) and is expressed by the plasma disappearance rate (PDR) for ICG (normal value 18-25%/min) and the residual ICG after 15 min (R15, normal value 0-10%). In this study we investigated the correlation between PDR/R15 and intra-abdominal pressure (IAP), SOFA score, and classic liver function tests in mixed ICU patients. Methods A total of 130 paired measurements were performed in 28 patients. The IAP was obtained using a balloon-tipped stomach catheter connected to an IAP monitor (Spiegelberg, Germany). The male/female ratio was 3/2, age was 58.2 ± 12.1 years, APACHE II score was 25.8 ± 15.7, SAPS II score was 44.4 ± 13.9, MODS was 6.4 ± 3, and SOFA score was 6.9 ± 3.6. The number of measurements in each patient was 4.6 ± 3.6. Calculation of correlation was performed with the Prism GraphPad™ software (version 2.00, 31 October 1995), and values are presented as mean ± SD. The values for IAP were 10 ± 4 mmHg (normal value 0-5 mmHg), PDR was 13.6 ± 8.4%, and R15 was 22.3 ± 19.8%. The correlation between IAP and PDR/R15 was poor although significant (R = 0.4), while the correlation between PDR/R15 and the classic liver tests (R < 0.1) was negligible: aspartate aminotransferase, alanine aminotransferase, lactate dehydrogenase, gamma-glutamyl transferase, alkaline phosphatase, and venous NH3. Of the so-called liver synthesis function tests (albumin, bilirubin, plasma cholinesterase levels and prothrombin time), only the latter had a good correlation. Neither platelets nor general hemodynamic parameters nor lactate were well correlated. A significant and reasonable correlation was observed between PDR/R15 and the SOFA score and the number of organ failures. Finally, the correlation between PDR and R15 was good (R = 0.8). Mortality was 57%; PDR was significantly lower (10.4 ± 5.7 vs 16.3 ± 6.6) in patients who died, while IAP (10.6 ± 3.9 vs 8.6 ± 3.6), SOFA score (13 ± 3.3 vs 7.7 ± 3.4) and number of organ failures (2.5 ± 1.1 vs 1.2 ± 0.9) were significantly higher. The number of measurement failures was 14% before and 3% after a software upgrade. Conclusions LiMON® measurements are feasible at the bedside. There is no correlation between LiMON®-derived parameters and the classic liver function tests except coagulation. Correlation with IAP was significant but poor. LiMON®-derived parameters correlated with SOFA score and number of organ failures and give additional information. The PDR was lower in patients who died, while IAP, SOFA score and the number of organ failures were higher.

Introduction Ultrasound-guided catheterization of the internal jugular vein has proved to be of benefit, especially for patients with specific problems such as those hospitalized in ICUs. Our purpose was to evaluate the usefulness of such a method compared with the landmarks method, when the former is performed by a senior intensivist trained in ultrasound for 1 year and recently taught the Doppler guidance method, and the latter by experienced staff.
Patients and methods A prospective randomized study was performed during a 3-year period in the 12-bed multidisciplinary ICU of 'G. Gennimatas' General Hospital (1 November 2000 to 1 November 2003). One group was assigned to internal jugular vein cannulation by the landmarks method (control group) and the other to cannulation with ultrasound guidance (ultrasound group). Sixty-six patients (17 women and 49 men) were examined. Thirty-five patients were submitted to catheterization with the landmarks method and 31 patients with ultrasound guidance. Internal jugular vein cannulation was successful in 29 cases (82.8%) in the control group and in 30 cases (96.7%) in the ultrasound group (P = 0.07). Carotid artery puncture occurred in two cases in the first group and in one case in the second (P = 0.80). Jugular cannulation was successful at the first attempt in 74.2% and 86.6% of cases in the first and second group, respectively (P = 0.25). The average access time was longer in the control group (9.33 s vs 6.85 s in the ultrasound group). Conclusion Ultrasound guidance improved the success rate of jugular vein cannulation and reduced the number of failures and complications as well as the number of attempts, but the difference between the two methods was not significant. The familiarity of the operators with the method is probably a contributing factor and must be taken into account.

In the United Kingdom the use of real-time ultrasound (US) is recommended for the insertion of central venous catheters (CVCs) into the internal jugular vein in adults [1]. This is associated with decreased complications and decreased failure rates [2], and is possibly faster than insertion by a standard landmark technique. Concerns, however, exist regarding the training implications, cost and usefulness of these recommendations. We evaluated the effectiveness of US-guided central line insertion. Methods Between 1 February 2003 and 1 August 2003 we prospectively collected data regarding the use of US for CVC insertion using the SonoSite 180PLUS ultrasound machine at a 14-bed general intensive care unit. All junior doctors underwent both bedside and theoretical teaching. One junior doctor underwent formal training. Results Eighty-two US-guided CVCs were inserted, the majority by senior house officers (88%). Most operators (82%) were inexperienced with the use of US (fewer than 20 previous US-guided CVCs inserted), while most (81%) were experienced in non-US placement (more than 20 previous non-US-guided CVCs inserted). Eighty-five per cent of CVCs were inserted into the internal jugular veins and 15% into the femoral veins. There was a 4% failure rate. Sixty-three per cent of CVCs were inserted on the first attempt, and 21% on the second attempt. The average duration of insertion was 18 min, with 57% taking less than 15 min. There was a 10% complication rate (excluding threading of guidewire-related problems); all complications were minor. Conclusions US-guided CVC insertion by inexperienced junior doctors with minimal training is not only feasible, but also appears to decrease complication and failure rates and to increase the rate of first-pass insertion.

Objective Sepsis is a leading cause of mortality in intensive care units (ICUs). Septic patients have better prognoses when multiple organ dysfunction syndrome is not present.
Since the variability of the R-R interval (HRV) depends in part on the coupling between the heart and other organs, HRV may be a readily available tool for evaluating nonlinear dynamic relationships among organs and the development of multiple organ dysfunction syndrome in sepsis. We conducted a prospective study to analyze HRV, hemodynamic and echocardiographic parameters, and serum cardiac markers (troponin and creatine phosphokinase), and to evaluate their possible relation with outcome. Design A prospective observational analysis of serum of patients meeting the criteria for septic shock. Setting A 28-bed medical-surgical ICU in a university hospital. Patients Twenty-five patients were analyzed in the study. We selected all consecutive patients who met the criteria for septic shock in our ICU and collected blood samples for analysis on days 1, 3, 6, 9 and 12, or until death. We analyzed total creatine phosphokinase (CPK) and its MB fraction (CK-MB), and also analyzed troponin. We analyzed hemodynamic parameters by pulmonary artery catheter, cardiac ultrasonography and 24-hour Holter recording at the time points days 1, 3, 6, 9, and 12. Statistical analysis All results are presented as the mean and standard deviation. For analysis we divided patients into survivors and nonsurvivors up to hospital release. We performed an ANOVA for repeated measurements on continuous variables, and correlation coefficients were determined according to multiple-level regression analysis. P < 0.05 was considered significant. Our mortality was 60%: 15 patients in the nonsurvivor group and 10 patients in the survivor group. The two groups had similar APACHE II scores (nonsurvivors 26; survivors 24 ± 5; NS). CPK, CK-MB, echocardiographic analysis, cardiac output, and vascular resistance did not show any significant difference at any moment during the study period. However, troponin showed a significant difference from the first day of the study, followed by stroke work as a significant difference between survivors and nonsurvivors. The difference in stroke work became larger from day 3 onwards. The HRV showed a significant difference in maximal and minimal low frequency (LF) and in maximal high frequency. Only maximal LF was an independent predictor of patient outcome. Conclusions In our study, HRV showed the capability to predict patient outcome. Heart dysfunction was detected only by serum troponin levels and by the hemodynamic data (stroke work).

Introduction Whether the day and time of ICU admission or discharge impacts patient care and outcome has been the focus of recent inquiry. There may also be a relationship between timing and outcome for events occurring within the ICU. As a first step to explore this issue, we aimed to describe the timing of new acute hemodynamic events in a large ICU population. Hypothesis Acute hemodynamic events are equally distributed throughout the day and throughout the week. Although there was a trend to fewer events in the early morning and evening, event frequency generally varied little by hour of the day (Fig. 1: distribution of events throughout the day).

... and data from the hospital stay were analyzed. The majority of patients (n = 255) proved to have ischemic heart pain (group A); 59 of them developed MI and 196 had angina and were treated accordingly.
The rest of the patients could be divided into two groups: patients with no heart condition (group B, n = 62), discharged with diagnoses of musculoskeletal pain (n = 54), cholecystitis (n = 3), gastritis (n = 2), depression (n = 1), sepsis (n = 1) or meningitis (n = 1); and patients with cardiovascular disease but no actual ischemia (group C, n = 46), discharged with diagnoses of heart failure (n = 18), hypertension (n = 16), tachycardia (n = 5), bradycardia (n = 3), dissection of the aorta (n = 2) and pulmonary embolism (n = 2). There were no statistically significant differences in gender between the groups. Patients in groups B and C were older than those in group A (P < 0.05); the significance is even greater when comparing only groups A and B (P < 0.005). Duration of examination in the ER did not influence the accuracy of diagnosis; patients in groups B and C were actually examined longer (P < 0.05). We found that the proportion of patients with a history of angina or MI was greater in group A than in group B or C (55.1%, 31.1% and 31.3%, respectively), and this was statistically significant (P < 0.005). Smoking, high cholesterol, hypertension, and diabetes did not significantly differ between the groups. Interestingly, we found that the proportion of patients admitted during afternoons, nights or weekends was higher in groups B and C than in group A (P < 0.05), although the characteristics of these patients did not differ from those admitted during the 'ICU staff shift'.

Following clinical evaluation including 12-lead ECG, all patients were subjected to routine laboratory testing (including liver and kidney functions, lipid profile, CBC, and random blood sugar). Specific laboratory tests included serum fibrinogen and plasma NO expressed as its metabolites, nitrites and nitrates (NOx), as measured by the Griess reaction. All patients and controls were subjected to diagnostic coronary arteriography with assessment of the extent of coronary disease (number of affected vessels) as well as the severity of stenosis (using the Gensini scoring system).

Myocardial dysfunction is common in critically ill patients. The diagnosis of an acute coronary syndrome has been revolutionised by the protein isoform cardiac troponin T (cTnT). In the non-intensive-care setting, early recognition of patients with elevated cTnT and the instigation of appropriate treatment has been shown to reduce the risk of death and myocardial infarction. Concern has been raised about the prognostic value of cTnT in patients with renal dysfunction, as troponin is renally excreted. Methods A retrospective audit was carried out on 180 consecutive admissions to a general (noncardiothoracic) intensive care unit (ICU) to establish the incidence of raised cTnT (cTnT > 0.1 ng/ml) in patients with and without renal dysfunction. Renal dysfunction was defined as creatinine > 120 µmol/l. Results are expressed as median (minimum-maximum). The outcome and cTnT of patients with and without renal dysfunction are presented in Table 1. There was no correlation between renal function and cTnT level (r = -0.06). There was no significant difference in outcome between raised-cTnT patients with and without renal dysfunction. Conclusion Elevated markers of cardiac myocyte damage are common in critically ill patients and are associated with an increased mortality rate. This effect is seen irrespective of renal function.

Methods Thirty-six hours after intravenous endotoxin (LPS) administration, 26 rabbits were anaesthetised and ventilated.
Systolic arterial pressure (sAP), diastolic arterial pressure (dAP) and mean arterial pressure (mAP) (mmHg), systolic (sAoV) and mean (mAoV) aortic blood flow velocities (20 MHz pulsed Doppler; cm/s), and systolic (sRen) and diastolic (dRen) renal artery blood flow (transonic Doppler; ml/min) were measured in the anaesthetised and ventilated rabbits. Heart inotropic quality was estimated by the maximal acceleration (Gmax, cm/s2) and sAoV. After 30 min of stabilisation, rabbits received levosimendan (200 µg/kg/hour, LS+) or saline (LS-) for 30 min. They were then separated into four groups: (1) LS+ with NE (1 µg/kg/min, 90 min); (2) LS+ with AVP (14 ng/kg/min, 90 min); (3) NE alone; (4) AVP alone. All parameters were Gaussian. Statistical analysis was performed using one-way and two-way ANOVA. Results LS consistently improved sAoV and Gmax within 30 min (P < 0.05 for both), while sAP, dAP, and mAP decreased (P < 0.05). Addition of AVP or NE similarly restored mean arterial pressure. However, their effects on myocardial function diverged. While NE did not alter sAoV and Gmax, AVP markedly worsened both contractile parameters (-25% and -45%, respectively, after 60 min; both P < 0.01). This effect of AVP was observed whether it was used in combination with LS or alone. No significant effect of the treatments was observed on sRen and dRen. Our study demonstrated that LS is a good option to restore cardiac contractile function when combined with NE. The use of AVP may further worsen sepsis-related myocardial dysfunction even when combined with a positive inotropic agent.

Levosimendan is used intravenously for the management of heart failure. At the therapeutic dose, levosimendan acts predominantly as a calcium sensitizer and ATP-sensitive K+-channel (KATP-channel) opener. It binds directly to the calcium-dependent site on troponin C, stabilizing the calcium-induced conformational change, and enhances the calcium sensitivity of the cardiac myofilaments. The advantages of a calcium sensitizer over dobutamine are the lack of intracellular calcium overload and that it acts without increasing the energy demand for handling intracellular calcium. Method Fifteen patients with cardiogenic, septic or mixed cardiogenic/septic shock were administered a loading dose of 12 µg/kg levosimendan, followed by a continuous infusion of 0.1 µg/kg/min for 24 hours. Five patients were diagnosed with septic shock, six with cardiogenic shock and four with mixed septic and cardiogenic shock. The mean APACHE II score was 21.3 ± 6.4. The arterial blood pressures of the patients were monitored closely via arterial catheters. Intravenous noradrenaline was administered, where necessary, to maintain a mean arterial pressure > 65 mmHg. The echocardiographic left ventricular ejection fraction (LVEF) (Simpson's method) and plasma B-type natriuretic peptide (Biosite Triage method) were measured before and after (within 1 hour of) the levosimendan infusion. The LVEF showed a significant but relatively small improvement in response to levosimendan infusion (pre, 25.7 ± 11.0% vs post, 29.8 ± 8.6%), representing an absolute change of 4.1 ± 8.4% (P = 0.039). The plasma B-type natriuretic peptide concentration demonstrated a significant decrease, from 993 ± 389 to 644 ± 408 pg/ml, before and after levosimendan infusion (P = 0.001). The use of levosimendan in shock patients proved to be feasible, and holds promise as a potential alternative to catecholamine inotropes.
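As a worked example of the arithmetic behind such a regimen (elsewhere in these abstracts the loading dose is consistently 12 µg/kg given over 10 min), the pump rates for a given body weight can be sketched as follows. The 0.025 mg/ml dilution is our assumption for illustration, not stated in the abstract.

```python
def levosimendan_rates(weight_kg, conc_mg_per_ml=0.025,
                       load_ug_per_kg=12, load_min=10, maint_ug_kg_min=0.1):
    """Return (loading, maintenance) pump rates in ml/h for the sketched regimen."""
    # Loading: total mg / concentration gives ml, delivered over load_min minutes.
    load_ml_h = (load_ug_per_kg * weight_kg / 1000) / conc_mg_per_ml * (60 / load_min)
    # Maintenance: ug/kg/min -> mg/h, then divided by concentration.
    maint_ml_h = (maint_ug_kg_min * weight_kg * 60 / 1000) / conc_mg_per_ml
    return load_ml_h, maint_ml_h

# 80 kg patient: 12 ug/kg over 10 min, then 0.1 ug/kg/min for 24 h.
load, maint = levosimendan_rates(80)
print(f"loading {load:.1f} ml/h for 10 min, then {maint:.1f} ml/h")  # 230.4, 19.2
```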
Introduction Levosimendan (LS), a new inodilator, improves survival in patients with congestive heart failure, but data on critical care patients with cardiorespiratory failure are scarce [1]. The aim of this study is to evaluate the haemodynamic and volumetric response to LS in critical care patients with cardiogenic shock. Ten critical care patients with cardiogenic shock were studied. All patients were mechanically ventilated and connected to an integrated monitoring system (PiCCO system/Agilent) by a fiberoptic arterial catheter (pv 2014L16) and by a Swan-Ganz catheter. At baseline (T0) and at 1 hour (T1), 6 hours (T2), 12 hours (T3) and 24 hours (T4) during LS infusion, and at 12 hours after LS discontinuation (T5), the main haemodynamic and volumetric data were studied. All data are expressed as mean ± SD. Repeated-measures ANOVA was used to compare changes across the study time points. Table 1 presents the main haemodynamic changes in cardiac output (CO), pulmonary capillary wedge pressure (PCWP), intrathoracic blood volume (ITBVI) and extravascular lung water (EVLWI).

Background Levosimendan is a new calcium sensitizer with inodilatory properties. The aim of this case series report was to evaluate the effects of perioperative and postoperative use of levosimendan in cardiac surgical patients with high perioperative risk, compromised left ventricular (LV) function or difficulties in weaning from cardiopulmonary bypass (CPB). Sixteen cardiac surgical patients received levosimendan infusion with a maximum duration of 29 hours. Eight infusions were initiated preoperatively and eight postoperatively. Most patients received 0.1 µg/kg/min levosimendan after a bolus of 12-24 µg/kg over 10 min. The main operative indication was coronary artery disease. Seventy-five per cent of the subjects were high-risk patients. Because of postoperative cardiac pacing, the heart rate was not analysed. Continuous infusion of levosimendan increased the cardiac index significantly in both groups. The pulmonary capillary wedge pressure and systolic blood pressure did not change. Crystalloids and vasopressors, most commonly noradrenaline and adrenaline, were administered as needed. Weaning from CPB was successful in all the patients. Two of the infusions were discontinued due to hypotension. One high-risk patient in the preoperative group died during the operation. In the postoperative group, two patients with multiorgan failure died postoperatively.

Introduction The aim of the study was to determine the hemodynamic and clinical effects of levosimendan (LS) in cardiosurgical patients. LS enhances the contractile function of stunned myocardium without increasing concentrations of intracellular calcium or myocardial oxygen consumption. Methods Ten mechanically ventilated and sedated patients (age 71 ± 6 years) were included in this study. The diagnosis was coronary artery bypass operation (n = 7), valve replacement (n = 1) and pulmonary edema (n = 2). Patients were prospectively selected to receive LS if hemodynamic data measured by a pulmonary artery catheter and estimation of cardiac function by echocardiography indicated a need for positive inotropic support. The infusion rate of LS was 0.1-0.2 µg/kg/min. A pre-existing infusion of epinephrine or norepinephrine was titrated to maintain a mean arterial pressure of 65-100 mmHg. Crystalloids and colloids were administered to maintain a pulmonary capillary wedge pressure (PCWP) > 14 mmHg.
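The derived indices reported in the results below (stroke volume index, systemic vascular resistance index) follow from standard textbook formulas; a sketch with illustrative inputs (the heart rate, MAP and CVP used here are assumptions, not values reported by the study):

```python
def stroke_volume_index(ci_l_min_m2, hr_bpm):
    """SVI (ml/m2) = CI (l/min/m2) converted to ml, divided by heart rate."""
    return ci_l_min_m2 * 1000 / hr_bpm

def svri(map_mmhg, cvp_mmhg, ci_l_min_m2):
    """SVRI (dyn.s.cm-5.m2) = 80 * (MAP - CVP) / CI."""
    return 80 * (map_mmhg - cvp_mmhg) / ci_l_min_m2

print(stroke_volume_index(2.2, 96))   # ~22.9 ml/m2, cf. the baseline SVI reported
print(svri(80, 10, 2.2))              # ~2545, cf. the baseline SVRI reported
```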
Measurements were obtained at baseline and 3 hours and 24 hours after starting the LS infusion. Statistical analysis was performed with a paired t test. P < 0.005 was considered statistically significant. Results LS caused a significant increase in cardiac index (CI) from 2.2 ± 0.4 l/min/m2 at baseline to 3.3 ± 0.5 l/min/m2 after 24 hours of infusion. The increase in CI was due mainly to an increase in stroke volume, as the heart rate remained nearly unchanged during the study period. The stroke volume index increased from 22.9 ± 4 ml/m2 to 33.9 ± 9 ml/m2 (P < 0.005). The left ventricular ejection fraction, as estimated by echocardiography, increased from 29.5 ± 5.5% to 33.9 ± 9% (P = 0.001). The systemic vascular resistance index decreased significantly from 2539.5 ± 551.4 dyn·s·cm-5·m2 to 1791.5 ± 276 dyn·s·cm-5·m2 (P < 0.005). LS did not cause significant changes in pulmonary vascular resistance (PVRI 318.5 ± 104 dyn·s·cm-5·m2 at baseline, 293 ± 136 dyn·s·cm-5·m2 after 24 hours). There was a fall in PCWP, although this was not significant. In this study the effects of a new calcium sensitiser were evaluated after cardiac surgery. The study demonstrates that LS exerts favourable hemodynamic effects in these patients, without increasing myocardial oxygen consumption. LS has the potential to treat low cardiac output states after cardiopulmonary bypass surgery, and its use in these situations might be of special value.

Acutely decompensated heart failure (ADHF) represents the most severe form of heart failure, with a short-term mortality approaching 30%. The SURVIVE study is the first prospective, randomised trial utilising mortality as the primary variable in evaluating the efficacy of intravenous drug therapy in ADHF. Levosimendan (LS) is a new calcium sensitiser for the treatment of ADHF. Previous studies with LS have shown decreased mortality in comparison with placebo or dobutamine (DB). The studies have, however, not been powered for mortality. A suspicion of a detrimental mortality effect of DB has also been raised, but it has not been proven in any sufficiently powered single study or meta-analysis. Therefore, DB is still the most widely used intravenous (IV) inotropic agent in the treatment of ADHF. The SURVIVE study is a multicentre, parallel-group, randomised, double-blind, double-dummy study in patients with ADHF comparing the efficacy of LS with that of DB. The SURVIVE study includes hospitalised patients with ADHF, left ventricular ejection fraction ≤ 30% and a clinical need for IV inotropic support. The primary endpoint of the study is all-cause mortality during the 180 days following randomisation. Patients will receive two simultaneous intravenous infusions, 'LS/placebo' with a maximum duration of 24 hours and 'DB/placebo' according to clinical judgement (but minimally for 24 hours and starting with a dose of 5 µg/kg/min). During the 180-day period, the originally randomised study drug can be readministered when clinically justified. Altogether 700 patients will be recruited. The sample size is based on assuming 30% 180-day mortality in the DB group and a 33% relative risk reduction in the LS group, with a power of 85% and an alpha level of 0.05. The study is ongoing in eight countries (Finland, France, Germany, Israel, Latvia, Poland, Russia and the UK). After 450 patients have been followed for 1 month or after 90 deaths have occurred, the Data and Safety Monitoring Board will make a recommendation on the continuation/discontinuation of the study following pre-specified stopping rules.
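For reference, the stated sample-size assumptions can be checked against the standard two-proportion formula; the two-sided alpha and the normal approximation below are our reading, not a quotation of the study protocol.

```python
from scipy.stats import norm

def n_per_group(p1, p2, alpha=0.05, power=0.85):
    """Normal-approximation sample size per arm for comparing two proportions."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return num / (p1 - p2) ** 2

# 30% DB mortality, 33% relative risk reduction -> ~20% LS mortality.
n = n_per_group(0.30, 0.30 * (1 - 0.33))
print(round(n))   # ~343 per arm, i.e. ~690 patients, consistent with recruiting 700
```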
Separately, the Steering Committee follows the mortality rate without opening the treatment code and may recommend increasing the sample size. SURVIVE thus represents the first study with two ambitious goals in ADHF: (1) to study a new drug against the accepted reference treatment, DB; and (2) to assess mortality. In future, other trials will probably have to follow this new standard of design in this setting.

Background Hospitalisation, especially length of stay in intensive care, is the main cost driver in heart failure (HF). Levosimendan, a novel calcium sensitiser, improves both the short-term and long-term outcome of patients with acute HF. Objective To evaluate the length of intensive care and hospital stay in acute decompensated HF patients treated with levosimendan compared with placebo. The REVIVE I trial was a pilot trial comprising 100 patients with acute HF who were hospitalised for worsening HF and had dyspnea at rest despite intravenous (IV) diuretics. Patients were randomised (double-blind) to receive placebo (PBO) (n = 49) or IV levosimendan (LS) (n = 51), given as a loading dose of 12 µg/kg over 10 min followed by a continuous infusion (0.1 µg/kg/min for 50 min and 0.2 µg/kg/min for 23 hours). Among other measures, the durations of hospitalisation and intensive care (ICU/CCU) were prospectively recorded. At baseline, 34 out of 51 (67%) patients in the LS group and 25 out of 49 (51%) patients in the PBO group were treated in the ICU/CCU. One levosimendan-treated and three placebo-treated patients were subsequently admitted to the ICU/CCU after randomisation. The mean treatment time in the ICU/CCU was 4.4 days in the LS group and 5.1 days in the PBO group (median 4 days vs 5 days). The mean duration of the index hospitalization after randomization was 5.7 days for the LS group and 6.8 days for the PBO group (median 5 days in both groups). After the initial discharge, one patient in the LS group and seven patients in the PBO group were admitted to the ICU/CCU during a subsequent rehospitalization up to day 31. The mean treatment time in the ICU/CCU for these patients was 2.0 days in the LS group and 4.4 days in the PBO group (median 2 days vs 5 days). Of the acute HF patients who were admitted to intensive care, those treated with levosimendan spent on average 1 day less in an ICU/CCU than patients treated with usual care. Shortening the ICU treatment time by 1 day without increasing the total length of the initial hospitalisation could reduce total hospitalisation costs by up to US$2000-3000 per patient. These promising initial results await confirmation in the ongoing REVIVE II trial.

A total of 100 patients who were hospitalized for worsening HF and had dyspnea at rest despite IV diuretics were randomized (double-blind) to receive placebo (PBO) (n = 49) or IV levosimendan (LS) (n = 51), given as a loading dose of 12 µg/kg over 10 min followed by a continuous infusion (0.1 µg/kg/min for 50 min and 0.2 µg/kg/min for 23 hours); patients were then followed closely for an additional 4 days. Patients were classified as improved if they reported their HF to be moderately or markedly improved at specific timepoints. Patients were classified as worse if they died or received an IV medication for worsening HF during the 5-day study period. Patients were considered unchanged if they were neither improved nor worse. Different models using different definitions for improvement and worsening were prospectively and retrospectively examined.
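The three-way composite just described reduces to a small decision rule; a sketch follows (the field names are ours, for illustration only):

```python
def classify(patient):
    """Return 'worse', 'improved' or 'unchanged' for the 5-day composite."""
    # Worsening dominates: death or IV rescue medication for worsening HF.
    if patient["died"] or patient["iv_rescue_for_worsening_hf"]:
        return "worse"
    # Improvement: patient-reported moderate or marked improvement.
    if patient["self_assessment"] in ("moderately improved", "markedly improved"):
        return "improved"
    return "unchanged"

print(classify({"died": False, "iv_rescue_for_worsening_hf": False,
                "self_assessment": "markedly improved"}))   # -> improved
```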
The initial model defined improvement based on the responses at 24 hours and at 5 days, and restricted the definition of worsening to the use of IV vasodilators or inotropic agents. Using this model, improvement was observed more frequently with LS than with PBO (49% vs 33%), with no between-group difference in the proportion of patients who were worse (overall P = 0.229). When the definition of worsening was expanded to include the use of IV diuretics for worsening HF and confined to the occurrence of clinical events, LS-treated patients not only were more likely to show improvement (51% vs 33%) but were less likely to exhibit worsening (20% vs 35%) (overall P = 0.043). When the definition of improvement was expanded to include the responses at 6 hours (in addition to 24 hours and 5 days), the separation between the treatment groups increased even further (improvement in 33% vs 14% and worsening in 24% vs 37%) (overall P = 0.029) (LS vs PBO). The greater level of improvement in the clinical composite on levosimendan was supported by significant reductions in median plasma B-type natriuretic peptide concentrations compared with placebo at both 24 hours (-303 pg/ml vs -77.5 pg/ml, P = 0.001) and 5 days (-376 pg/ml vs -99.5 pg/ml, P = 0.027). These findings indicate that a clinical composite approach can be used to develop an endpoint that distinguishes the effects of IV LS in the setting of acute decompensated HF. The new endpoint is now being used prospectively to evaluate the effects of LS in a definitive second study (REVIVE-2).

The calcium sensitizer levosimendan is a novel inotrope used for the treatment of cardiac failure. Animal data suggest pulmonary vasodilating properties, and levosimendan has been suggested for the treatment of right ventricular (RV) failure despite scarce data in patients [1]. Settings An ICU and echo laboratory in a tertiary teaching hospital. We reviewed prospectively recorded transthoracic echocardiograms of seven cardiac failure patients and six critically ill patients, obtained before and after 24 hours of levosimendan infusion (bolus 12 µg/kg + 0.1 µg/kg/min), in whom continuous-wave Doppler (CW) recordings of peak tricuspid regurgitant (TR) flow velocity were available at both time points. The left ventricular ejection fraction (LVEF) was measured by Simpson's method. RV contractility was assessed by the maximal tricuspid annulus displacement during the cardiac cycle by M-mode (six patients only) and/or by a blinded observer. The maximal RV-right atrial gradient (TRmax), calculated (by the modified Bernoulli equation) from the CW recording of peak TR flow velocity in parasternal short axis or apical four-chamber views, is regarded as a preload-independent estimate of pulmonary artery vascular resistance (PVR) [2]. Values are expressed as the mean ± SD. The Wilcoxon signed-rank test was used to calculate the difference before and after levosimendan infusion. The patients were 10 males and three females with a mean age of 66 ± 14 years. The mean TRmax was 38.0 ± 9.2 mmHg prior to levosimendan infusion. After 24 hours of levosimendan infusion TRmax decreased to 30.9 ± 6.7 mmHg (P = 0.0015). In patients with markedly elevated baseline TRmax (> 35 mmHg), the median drop in TRmax was more pronounced than in patients with mild elevation of baseline TRmax (7.8 mmHg vs 2.4 mmHg).
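The TRmax values are pressure gradients obtained from the peak TR jet velocity via the modified Bernoulli equation; as a worked check, the velocities below are back-calculated from the reported mean gradients (they are not stated in the abstract):

```latex
% Modified Bernoulli equation: peak gradient (mmHg) from peak jet velocity v (m/s).
\Delta P_{\mathrm{max}} \approx 4\,v_{\mathrm{TR}}^{2}
% Back-calculated checks against the reported means:
% 38.0\ \mathrm{mmHg} = 4v^{2} \Rightarrow v \approx 3.1\ \mathrm{m/s\ (pre\text{-}infusion)}
% 30.9\ \mathrm{mmHg} = 4v^{2} \Rightarrow v \approx 2.8\ \mathrm{m/s\ (post\text{-}infusion)}
```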
Although the LVEF increased in all patients (from 18 ± 6% to 25 ± 7%) and RV contractility assessed by M-mode was better (from 1.48 ± 0.2 cm to 1.61 ± 0.2 cm), the blinded observer was not able to confirm improvement in RV contractility. Conclusion Levosimendan reduced the TRmax of peak TR flow velocity in cardiac failure and critically ill patients, implying a reduction in PVR. The degree of PVR reduction is greater in patients with baseline pulmonary hypertension.

Vasodilators might cause a deterioration in gas exchange. There have been isolated case reports in which oral sildenafil, a selective phosphodiesterase inhibitor (PDEI), has been used as an alternative to prostacyclin in primary pulmonary hypertension (PH) with early success. Our aim was to assess the acute and short-term effects of sildenafil in patients with PH of different etiologies. Seven patients were studied (four female, three male; mean age 49 ± 12 years). They included two patients with primary PH, two patients with Eisenmenger syndrome, two patients with thromboembolic PH and one patient with Bilharzial PH. Following clinical evaluation, all patients were subjected to Swan-Ganz catheterization, and the mean pulmonary artery pressure (mPAP), pulmonary vascular resistance (PVR), and mixed venous oxygen saturation (mVO2) were invasively measured. Patients were also subjected to echocardiographic evaluation of the right ventricular diameter (RVD) in the short axis, the left ventricular stroke volume (SV) and the cardiac output (COP). Readings were recorded before, 3 days after and 3 months after the start of sildenafil. We used oral sildenafil (25 mg) every 6 hours in all patients. Of the seven patients, five showed significant clinical (New York Heart Association [NYHA] class IV to NYHA class III), hemodynamic and echocardiographic improvement 3 days after therapy. The five patients included two patients with primary PH, one patient with Eisenmenger syndrome, one with Bilharzial PH and one thromboembolic patient. These patients showed a significant reduction of mPAP (107 to 73 mmHg, -32%; P < 0.01) and mPVR (1988 to 1177 dyn·s·cm-5, -41%; P < 0.01) with an insignificant rise in mVO2 (48 to 53 Torr). All hemodynamic changes occurred without significant reduction of arterial blood pressure and SVR. Echocardiography showed an insignificant mild reduction in RVD (6.6 to 6.4 cm) with a significant rise in SV (35 to 42 ml) and COP (3.7 to 4.2 l/min). Follow-up 3 months later showed improvement in four of the latter five patients; the fifth, the patient with Bilharzial PH, died suddenly 5 days after discharge. The other four patients showed further subjective improvement (NYHA class III to II), reduction of mPAP (73 to 57 mmHg, -22%; P < 0.05) and mean PVR (1177 to 759 dyn·s·cm-5, -36%; P < 0.05), with a further rise in mVO2 (53 to 60 Torr). Echocardiography also showed a significant reduction of the RVD (6.4 to 5.6 cm) with a further rise in SV (42 to 48 ml) and COP (4.2 to 4.7 l/min, P < 0.01). Conclusions (1) Sildenafil proved to be effective acutely and in the short term, subjectively and objectively, especially in patients with primary PH; the long-term effect and 5-year mortality remain to be evaluated. (2) The effect of sildenafil on Bilharzial PH is guarded. A larger group of Bilharzial PH patients should be studied, especially the milder forms.

Interventions None. A total of 453 patients were included. AF occurred in 24 patients (5.3%).
In univariate analysis, advanced age, pre-existing cardiovascular disease, previous treatment with calcium-channel blockers, and the Simplified Acute Physiologic Score (SAPS II) were significant predictors of AF. Patients with AF received significantly more fluids and catecholamines, and experienced more sepsis, shock (especially septic shock), and acute renal failure. The severity score (SAPS II), the ICU workload (OMEGA), the ICU and hospital lengths of stay, and mortality were significantly increased in patients who developed AF. Multivariate analysis identified five independent predictors of AF: advanced age, blunt thoracic trauma, shock, pulmonary artery catheter, and previous treatment with calcium-channel blockers. The incidence of AF in a SICU appeared higher than in the general population but lower than in cardiac surgery units. The onset of AF reflects the severity of the disease and is associated with increased mortality and morbidity. Five independent risk factors for AF were identified. AF seems to be more frequent after blunt thoracic trauma.

... [1]. In cardiac surgery at the present time there are few studies devoted to the evolution of natriuretic peptides in the perioperative period [2-5]. Objective The aim of our study was to examine the release pattern of BNP and NT-proBNP in the perioperative period of coronary surgery in two groups of patients: ventricular ejection fraction (VEF) > 50% and VEF < 50%. A secondary aim was to compare plasma levels of BNP and NT-proBNP with the hemodynamic parameters. Methods (1) Twenty-one patients undergoing coronary artery bypass grafting (CABG) by sternotomy, cardiopulmonary bypass (CPB) and moderate systemic hypothermia were divided into two groups: group I = 11 patients with normal left ventricular ejection fraction (LVEF > 50%) and group II = 10 patients with depressed LVEF (LVEF < 50%). Intraoperative anesthetic management was uniform for all patients. (2) Blood samples were taken before surgical incision, after removal of the aortic cross clamp (10 min following cross-clamp removal) and after surgery: at arrival in intensive care (T0) and 3, 6, 9, 12, 24, and 48 hours and 8 days after CPB. BNP and NT-proBNP concentrations were measured by electrochemiluminescence immunoassay. (3) Hemodynamic parameters were measured at the same times as the blood samples were taken. Statistics Statistical comparisons were performed by the Mann-Whitney U test and the Wilcoxon nonparametric test. P < 0.05 was considered to indicate statistical significance. Results BNP and NT-proBNP decreased when the aortic clamp was applied. After removal of the aortic cross clamp, BNP and NT-proBNP increased gradually in the two groups. The peak of BNP occurred between 12 and 24 hours and the peak of NT-proBNP between 24 and 48 hours. BNP and NT-proBNP increased in the same manner, in a similar exponential fashion, until 12 hours after surgery. The logarithmic model showed the same time constant in both groups. There was a good correlation of both BNP and NT-proBNP with the hemodynamic parameters of the left ventricle. In this cohort of uncomplicated CABG patients, BNP and NT-proBNP increased in a similar exponential fashion until 12 hours after surgery. This very strong correlation between BNP and NT-proBNP at all time points with baseline values implies that a similar process is occurring in all our patients undergoing cardiac surgery. NT-proBNP is more sensitive than BNP to changes in the hemodynamic parameters.
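The claim of an exponential rise with a common time constant corresponds to fitting a saturating exponential to the post-CPB concentrations; a sketch follows with hypothetical values (the study's individual data points are not given in the abstract):

```python
import numpy as np
from scipy.optimize import curve_fit

t_h = np.array([0, 3, 6, 9, 12])            # hours after CPB
bnp = np.array([40, 150, 230, 290, 330])    # pg/ml, hypothetical

def rise(t, a, tau, b):
    """Saturating exponential: b + a * (1 - exp(-t/tau))."""
    return b + a * (1 - np.exp(-t / tau))

(a, tau, b), _ = curve_fit(rise, t_h, bnp, p0=(300, 5, 40))
print(f"fitted time constant tau ~ {tau:.1f} h")
```

Comparing the fitted tau between the two LVEF groups (or between BNP and NT-proBNP) is the kind of check the abstract's "same time constant" statement implies.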
Results Before CP, myocardial necrosis was not detectable. Release of TnI and CK-MB into the CS appeared from the first minute of reperfusion and increased over time (Fig. 1). The early postoperative performance of the heart was not inferior to pre-CPB values (Table 1). Conclusion CP is associated with irreversible myocardial injury. The contractile function of the heart evaluated by pulmonary artery catheter provides little information on the extent of myocardial damage.

Methods One hundred and twenty-three very elderly patients with AMI were admitted to the Chinese PLA General Hospital from 1 January 1993 to 31 August 2002. Patients were divided into two groups: the patients in group A died in hospital within 30 days and the patients in group B survived. Their clinical characteristics, risk factors, clinical presentation, treatment and complications were analyzed. Results Twenty-six patients died of AMI within 30 days. Univariate analysis indicated that histories of diabetes, prior myocardial infarction and cerebral infarction, a high white blood cell count (≥ 10 × 10^9/l), a low left ventricular ejection fraction (< 50%), in-hospital complications including arrhythmia, pump failure and pneumonia, as well as treatment without aspirin, were significantly associated with in-hospital mortality within 30 days. Multivariate logistic regression analysis, using mortality as the dependent variable and the history, in-hospital complications and so on as independent variables, showed that the major determinants of in-hospital mortality were histories of diabetes (odds ratio = 3.58, 95% confidence interval [CI] = 1.08-11.90) and cerebral infarction (odds ratio = 6.82, 95% CI = 1.55-29.98), and the in-hospital complication of pump failure (odds ratio = 13.11, 95% CI = 3.84-44.78). Conclusions These findings demonstrate that histories of diabetes and cerebral infarction, as well as the in-hospital complication of pump failure, were independent predictors of short-term prognosis in very elderly patients with AMI.

Myocardial infarction had occurred within less than 7 days in 80% of cases and less than 24 hours in 60%. One vessel was treated in 83% of the cases. The ratio of stented segments was 0.56. The success rate per segment was 84%. Full procedural success was obtained in 80%, incomplete success in 4%. Sixteen per cent of the attempts failed. In-hospital mortality was 54%. In-hospital major adverse events (death, CABG, myocardial infarction) occurred in 62%. Survival at 1 year and 5 years was 41 ± 4% and 40 ± 4% (NS), respectively. Survival at 5 years after a successful procedure was 44 ± 5% versus 17 ± 8% after incomplete success/failure (P < 0.001). Multivariate analysis showed four independent variables for the long-term outcome: age, left ventricular dysfunction, extent of coronary artery disease and procedural result.

Aim To assess the safety, feasibility and efficacy of the Impella device in patients with cardiogenic shock or patients undergoing high-risk surgery or PCI. Study A triple-centre, prospective, nonrandomised, two-arm study. The patient population was: Arm 1, cardiogenic shock patients with low cardiac output defined as a cardiac index < 2.0 l/min/m2 and pulmonary capillary wedge pressure (PCW) > 18 mmHg, or patients with PCW > 18 mmHg and a systolic blood pressure (BP) < 90 mmHg or in need of inotropes to keep systolic BP > 90 mmHg; and Arm 2, high-risk CABG or high-risk PCI following the EuroSCORE criteria. Sample size: 20 of 45 planned patients have been included.
Hemodynamics included the assessment of cardiac output (CO), PCW, pulmonary artery pressure (AP), arterial BP and intracardiac pump parameters. Biochemistry included free hemoglobin at baseline, each hour during the pump run and 6 hours after removal of the pump. The results of the first 20 patients are presented in Tables 1 and 2. Conclusions These preliminary data indicate the feasibility of the Impella pump for patients in cardiogenic shock, high-risk CABG and PCI. More data will be available in March 2004.

Results and discussion Sixty (92.3%) patients were successfully treated. Fifty-six (86.1%) had a total recovery, four (6.1%) had a partial response and treatment failed in five patients (7.6%). Embolic migration occurred in five (7.6%) patients and two patients (3%) died because of it. Three patients (4.6%) had some hemorrhagic disorders. Conclusions Thrombolytic therapy is feasible and safe. Surgical treatment is not contraindicated even when thrombolytic therapy has partially failed, since it then takes place with a better hemodynamic status.

Introduction Deep vein thrombosis (DVT) is a serious complication that may develop in critically ill ICU patients as a consequence of immobilization, femoral central venous catheter (CVC) placement and activation of the thrombotic cascade. We investigated the epidemiology of DVT in critically ill ICU patients expected to be hospitalized for at least 2 weeks. We used a triplex examination of the lower extremities on admission, on days 7 and 14, and on clinical suspicion of DVT.

Because of its invasive nature and technical complexity, pulmonary angiography (PA), the diagnostic gold standard, was not a routine procedure. We report our experience with emergency PA in clinically suspected cases of PE to highlight its merits and the limitations of clinical examination. We studied 18 patients with clinically suspected PE (six male, 12 female; mean age 49.5 years). Predisposing factors included heart disease in two patients, diabetes mellitus in five patients, polytrauma in three patients and autoimmune disease (i.e. Behcet disease) in one patient. Four patients were dehydrated and bedridden. Following clinical evaluation, electrocardiogram and chest X-ray, all patients were subjected to routine laboratory evaluation, arterial blood gas measurement and a specific coagulation profile (fibrin, fibrin degradation products, D-dimer). All patients were then subjected to first-pass radionuclide angiography. PA was done in all patients within a mean period of 2 days (day 0-day 4). Following acute imaging, PA revealed the presence of PE in only eight patients in the form of distal cutoff and/or filling defects, while 10 patients had negative PA for PE. Compared with patients with negative PA, those with positive findings were more frequently hypotensive (50% vs 20%), more hypoxic (100% vs 90%) and more congested (100% vs 80%), with more positive echocardiographic data (85% vs 60%). They also exhibited significantly more scintigraphic evidence of impaired RV ejection fraction than patients with negative PA (80% vs 20%). Conclusion Emergency PA is a feasible, safe, highly sensitive diagnostic tool in acute PE before starting intrapulmonary or systemic thrombolytic therapy with its potential hazards. In view of the ready availability of the catheter laboratory, as well as the safety and ease of performance of the procedure, emergency diagnostic PA is recommended in clinically suspected settings of PE.
Objective The primary objective of this study was to estimate the prevalence and incidence of diagnostically confirmed DVT and PE in medical-surgical ICU patients. The secondary objective was to examine VTE prophylaxis longitudinally, estimating the proportion of VTE events associated with prophylaxis failure versus failure to implement prophylaxis. The tertiary objective was to estimate the morbidity and mortality outcomes of patients with VTE. Design A 1-year retrospective observational multicenter cohort study of patients admitted to an ICU during 2000. Setting Twelve university-affiliated closed medical-surgical ICUs in eight cities in three provinces. We identified medical-surgical adult ICU patients who had either upper or lower limb DVT or PE in the 24 hours preceding ICU admission up to 48 hours post-ICU admission (prevalent cases), and any time during the ICU admission or up to 8 weeks following ICU discharge (incident cases). DVT was diagnosed by compression ultrasound of the upper or lower extremity, or venography. PE was diagnosed by ventilation-perfusion lung scan, chest computerized tomography, pulmonary angiogram, echocardiogram, S1Q3T3 on EKG, or autopsy. Based on a priori criteria, patients were categorized as definite or indeterminate VTE. The burden of illness associated with diagnostically confirmed DVT and PE in medical-surgical critically ill patients is low in comparison with the higher event rates of 10-20% detected by screening ultrasonography and autopsy studies. Most VTE events, although underdiagnosed, are associated with prophylaxis failure rather than failure to provide prophylaxis. More active implementation of VTE prevention strategies is needed in the ICU, as well as more rigorous evaluation of VTE prevention strategies. Acknowledgement This study was funded by an unrestricted grant from Pharmacia, Inc., the Physicians' Services Incorporated of Ontario, and the Ontario Thoracic Society.

The term hypercoagulability has been used loosely to describe the increased risk of pathologic thrombus formation. Traumatic brain injury (TBI) is often complicated by life-threatening thromboembolic events [1]. In the trauma patient the incidence of deep venous thrombosis (DVT) is estimated at 20-90%, and at 4-22% for pulmonary embolism (PE) [2]. The thrombelastograph (TEG) has proven useful in determining an adult patient's coagulation status by measuring the enzymatic/protein elements of coagulation, platelet function, and fibrinolysis from a whole blood sample [3]. The purpose of this study is to compare the TEG as a screening tool for hypercoagulability against other known parameters. The study was performed at the University of Mississippi Medical Center, Jackson, MS, in 30 TBI patients, 22 male and eight female, ranging in age from 18 to 79 years. TEG panels were drawn using either venous or arterial blood at 24 and 96 hours post TBI admission. Thirty replicated measurements were made with a two-channel TEG machine. A total of 13 subjects were considered to be hypercoagulable, eight were emerging hypercoagulable and nine were normal. The fibrinogen level as well as the factor VIII level were measured at the same time. Intrasample variation relative to expected fibrinogen results was estimated by the coefficient of variation for each subject (an illustrative calculation is sketched below). A random-effects analysis of variance (ANOVA) was performed, and the coefficient of variation (r) was estimated from the ratio of the hypercoagulability measure to fibrinogen and factor VIII.
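The per-subject coefficient of variation mentioned above is the ratio of the standard deviation to the mean of the replicated measurements. A minimal sketch, with hypothetical replicate values standing in for the study's TEG-derived fibrinogen data:

```python
import statistics

def coefficient_of_variation(replicates: list[float]) -> float:
    """CV = sample standard deviation divided by the mean of the replicates."""
    return statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicated fibrinogen-equivalent readings for one subject
subject_replicates = [412.0, 398.0, 405.0, 421.0]
print(f"CV = {coefficient_of_variation(subject_replicates):.3f}")
```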
P < 0.05 was considered statistically significant. The coefficient of variation was 0.96 for the fibrinogen level and 0.62 for factor VIII. Although the sample size is small, we suggest that the TEG is a useful screening tool for hypercoagulability in TBI, warranting careful monitoring for DVT and PE.

Introduction Heparin-induced thrombocytopenia (HIT-II) is the most significant complication of heparin therapy; its incidence depends on the type and duration of therapy and on the patient population. Although its incidence is known to be high after cardiac and orthopaedic surgery [1-3], it remains unknown in patients of a general intensive care unit (ICU patients). Purpose To investigate the frequency and clinical significance of positive heparin-platelet factor-4 antibodies (HPF4-Abs) in ICU patients. We prospectively investigated the incidence of HPF4-Abs in ICU patients who were treated with unfractionated heparin (UFH) or low molecular weight heparin (LMWH) as prophylaxis or treatment of venous thromboembolism, or with an intermittent flushing solution of UFH (5 IU/ml) for maintenance of central venous or arterial catheter patency (FUFH). HPF4-Ab assays were performed by enzyme-linked immunosorbent assay (ELISA) (Asserachrom HPIA; Stago) on days 1 (admission), 7 and 15 of treatment. The result was defined as negative if absorbance at 492 nm (A492) was < 0.671 and as positive if A492 ≥ 0.671. HIT-II was defined by clinical evaluation. We investigated 55 patients (16 female; median age 60 years, range 12-92 years) with an APACHE II illness severity score of 20.28 ± 6.8. Eleven patients were treated with FUFH, 38 with FUFH and LMWH, three with LMWH and one with UFH. Positive HPF4-Abs were detected in 17 (30.9%) patients on day 7 and 21 (39.5%) patients on day 15 (Table 1). None of the patients developed HIT-II. Despite the high prevalence of positive HPF4-Abs in ICU patients treated with FUFH or LMWHs, their clinical significance remains to be investigated in a larger number of patients, especially in cases of re-exposure to heparin.

Introduction Abnormal endothelial physiology has been implicated both in early atherogenesis and, later, in the control of dynamic plaque behavior. The biologic link between endothelial damage and atherosclerosis may be related to decreased arterial bioavailability of nitric oxide (NO), which may predispose to leucocyte and platelet adhesion, vasoconstriction and smooth muscle cell proliferation. Substances released by the endothelium include prostacyclin, NO, endothelin, von Willebrand factor and thrombomodulin (TM). TM is an integral membrane glycoprotein that can convert thrombin to an anticoagulant through activation of protein C, which, in the presence of protein S, inactivates factors VIIIa and Va and thereby inhibits further formation of thrombin. Soluble TM is thought to indicate endothelial-cell damage. A positive relation between the concentration of soluble thrombomodulin and the risk of atherosclerotic disease is widely assumed. Aim of the study To assess the diagnostic role of circulating TM as a marker of the extent and severity of coronary arterial atherosclerosis. We studied 150 patients with ischemic heart disease (118 male, 32 female; mean age 53.4 ± 8.3 years, range 32-80 years), together with 20 nonischemic patients who were catheterized prior to valve replacement (13 male, seven female; mean age 58.4 ± 4.5 years) serving as controls.
Of the 150 ischemic patients, 77 had anginal pain (AP) and 73 had acute myocardial infarction (AMI). Following clinical evaluation, all patients were subjected to 12-lead ECG, routine laboratory work, and coronary arteriography to assess the extent and severity of the stenotic lesions using the Gensini scoring system. TM levels were measured in arterial samples withdrawn from the coronary artery during the catheter procedure, using an enzyme immunoassay (ELISA). Compared with the control subjects, the ischemic patients exhibited significantly higher levels of serum TM (42.5 ± 15.4 vs 2.8 ± 0.8, P < 0.0001), with progressively higher levels of serum TM from the anginal group to the AMI group (35.7 ± 11.3 vs 49.6 ± 16.1, P < 0.000001). Serum TM correlated significantly with the severity of coronary artery pathology expressed as the Gensini score, with P < 0.001 for both the angina group and the AMI group. Both groups likewise exhibited a significant correlation between serum TM and the number of diseased vessels (P < 0.001 for both groups). Conclusion TM, an endothelial glycoprotein released by damage to the vascular endothelium by the atheromatous process, showed a significant correlation with the extent and severity of coronary artery disease (expressed by the Gensini score system). The higher TM level in the AMI group compared with the AP group points to more significant endothelial damage in the former. This underscores the importance of serum TM as a molecular marker of endothelial dysfunction in acute ischemic syndromes.

Background Plasma activity of antithrombin (AT), a physiological coagulation inhibitor with anti-inflammatory properties, is a marker of severity validated in sepsis. The nonthyroidal illness syndrome, whose first stage is a decrease in the plasma concentration of free tri-iodothyronine (fT3), is frequently observed in intensive care unit (ICU) patients. Plasma levels of fT3 and free thyroxine (fT4) correlate with severity and prognosis, respectively. The aim of this study is to determine whether a significant correlation exists between T3 and AT in sepsis. Methods Eighty-two patients (28 of them suffering from sepsis), admitted to the ICU of a general hospital over a 5-month period, were included at random. The exclusion criteria were a history of dysthyroidism or intake of treatment that affects thyroid function. The levels of AT, thyroid stimulating hormone (TSH), fT3, fT4, C-reactive protein (CRP) and the leukocyte count were measured at admission. These blood investigations were part of the routine biological work-up. APACHE II and SAPS II scores were calculated using data from the first 24 hours of the ICU stay. Patients were followed until ICU discharge or death to determine the survival rate. P < 0.05 and r < 0.05 were considered significant. Results A significant nonlinear correlation between AT and T3 was shown in our nonseptic (P = 0.0043, r = 0.393) and septic (P = 0.00220, r = 0.449) populations, whereas no such correlation was found with T4 (nonseptic group, P = 0.571; septic group, P = 0.265) or with TSH (nonseptic group, P = 0.993; septic group, P = 0.421). Only in the septic group did AT and fT3 correlate significantly with severity, evaluated by the APACHE II/SAPS II scores. A significant correlation between fT4 at admission and survival (P = 0.0208) existed only in patients with septic shock.
fT3 (1.39 ± 0.56 ng/l vs 2.04 ± 0.54 ng/l, P < 0.0001), like AT (51 ± 22.82% vs 82 ± 23.62%, P < 0.0001), was significantly lower in the septic group. Among the other measured parameters, CRP (131 ± 75 mg/l vs 44 ± 53 mg/l, P < 0.0001) and fT4 (9.4 ± 4.15 vs 11.37 ± 3.25 ng/ml, P = 0.0239) discriminated between the two groups, unlike the leukocyte count (P = 0.1375). We propose an explanation of the interrelation between AT and T3 in sepsis, based on the nuclear action of T3 on the AT gene and on the effects of AT on the NF-κB pathway, which regulates the peripheral deiodination of T4 to T3. Conclusions At ICU admission, a significant correlation was found in the septic group between T3 and AT, whereas there was none with T4 or TSH; T3 and AT also correlated with the APACHE II/SAPS II severity scores. A significant correlation between fT4 at admission and survival was found, but only in the septic shock subgroup. The decrease in fT3 is involved in the decrease in AT, and is related to severity and prognosis. A physiopathological explanation in sepsis could implicate NF-κB. Measurement of AT in septic patients could usefully be complemented by fT3 measurement.

Heparinised venous blood samples were obtained from 10 healthy adult volunteers, and PMNL were isolated by density gradient centrifugation and suspended in minimum essential medium supplemented with 10% autologous plasma. Cells were incubated for 16 hours with either 2 µg/ml lipopolysaccharide (LPS) (stimulated) or phosphate-buffered saline (unstimulated) in the presence of r-SPC or EI, or both. Cells were washed, treated with Annexin V/propidium iodide stains and analysed by dual-laser flow cytometry. The percentage of early apoptotic cells was determined using four-quadrant analysis. Data were analysed using Friedman analysis of variance with post hoc testing by the Wilcoxon signed-rank test for paired data (Table 1). The results of this pilot study show that r-SPC significantly increases activated PMNL apoptosis, and that this is maintained in the presence of EI. This suggests the possible use of r-SPC and elastase inhibitor in inducing neutrophil apoptosis, and hence clearance of neutrophils, in ALI/ARDS. Further studies are needed.

Background Cyclooxygenase-2 (COX-2) is the inducible COX isoform that catalyzes the formation of prostaglandins in response to proinflammatory cytokines. Prostacyclin, the major prostanoid generated by endothelial cells, is a potent inhibitor of platelet aggregation and a powerful vasodilator. Although the role of Drotrecogin alfa (activated) (recombinant human activated protein C [rhAPC]) in modulating microvascular coagulation through the inhibition of thrombin generation has been well studied in experimental and clinical settings of severe sepsis, little is known about its effects on prostanoid release from endothelial cells. The effect of rhAPC (1-20 µg/ml) on the expression of COX-2 mRNA in human umbilical vein endothelial cells (HUVEC) was measured by a colorimetric assay (Quantikine assay™). The COX-2 protein content in endothelial cells was determined by western blots. Cell supernatants were assayed for 6-keto-PGF1α, the stable hydrolysis product of prostacyclin, by ELISA. Statistical analysis was performed by unpaired Student's t test and ANOVA.
Results rhAPC dose-dependently upregulated COX-2 mRNA in endothelial cells after an incubation time of 4 hours (controls, 11.7 ± 0.6 amol/ml; HUVEC treated with 20 µg/ml rhAPC, 18.4 ± 0.8 amol/ml; mean ± SEM, P < 0.001). Western blot analysis revealed an increase in COX-2 protein content after a 10-hour treatment with rhAPC (1 µg/ml). rhAPC dose-dependently increased 6-keto-PGF1α levels in HUVEC supernatants (controls, 62.5 ± 8.7 pg/ml; HUVEC treated with 20 µg/ml rhAPC, 151.7 ± 16.3 pg/ml; P < 0.001). As shown by antagonists to the thrombin receptor PAR-1 (WEDE 15 and ATAP2) and by a monoclonal antibody (RCR-252) against the endothelial protein C receptor (EPCR), the effect of rhAPC on COX-2 mRNA upregulation was mediated by EPCR. The ability of rhAPC to upregulate COX-2 and prostanoid release in human endothelial cells, very probably mediated by EPCR, may represent a new molecular mechanism by which oxygen delivery at the site of injury may be improved and intravascular aggregation of platelets may be reduced, thereby contributing to the efficacy of rhAPC in systemic inflammation and sepsis.

Severe thrombocytopenia might be a risk factor for serious bleeding under this therapy. The median plasma concentration of APC at steady state is 0.045 µg/ml. A previous study revealed an APC-dependent reduction of CD62P on the platelet surface exclusively after activation with recombinant tissue factor. However, the amounts of APC used were high, above steady-state concentrations (0.5-20 µg/ml), and the effect of APC at therapeutic concentrations on the expression of platelet receptors has not yet been studied systematically. Objective To evaluate the influence of drotrecogin alfa (activated) on the in vitro expression of platelet receptors at a therapeutic concentration. Methods Citrated blood was drawn from healthy blood donors (n = 22, 45.5% male, age 38 ± 13 years [mean ± SD]). Exclusion criteria were smoking, diabetes and use of drugs interfering with platelet function. Blood samples were adjusted with APC to final concentrations of 0.045 µg/ml APC (group 1, therapeutic dose) and 0.225 µg/ml APC (group 2, fivefold therapeutic dose), respectively. The control group received no additional APC. To evaluate platelet reactivity, samples were activated with 5 µM thrombin-receptor-agonist-peptide-6 (TRAP-6; Bachem, Heidelberg, Germany) or 2.5 µM adenosine diphosphate (ADP; Sigma, Taufkirchen, Germany). Samples were incubated for 10 min at 37°C with fluorescence-labeled monoclonal antibodies (mAb) against CD62P (fluorescein isothiocyanate, FITC), CD41 (phycoerythrin [PE]), CD42b (PE), CD45 (FITC) (all from Beckman-Coulter, Krefeld, Germany) or PAC-1 (FITC) (Becton Dickinson, San Jose, CA, USA). Analyses were performed in a flow cytometer (Epics XL; Beckman-Coulter). The mean fluorescence intensity was calculated for all mAb except CD45. After gating granulocytes, the percentage of CD45/41-positive complexes was counted. Both calculations were run using the WinMDI software. Statistics for intergroup differences were performed by one-way ANOVA. Results APC in group 1 had no significant influence on platelet activation, with or without stimulation. In group 2, CD62P and CD45/41 showed a slight but nonsignificant decrease. Conclusion This study demonstrates that therapeutic plasma concentrations of drotrecogin alfa (activated) influence neither the expression of platelet activation markers nor platelet-granulocyte complexes in vitro.
Thus, a disturbance of primary hemostasis seems unlikely.

Statistics The Wilcoxon signed-rank test was used for paired samples from the same patients; the Mann-Whitney U test was used for comparisons with controls. Higher levels of TATc and lower levels of PAA were found in infected lungs compared with noninfected lungs and lungs from controls (P < 0.05). Protein C and APC concentrations were significantly decreased at the infected site (P < 0.01). See Fig. 1. Pneumonia is characterized by a strong procoagulant shift at the site of infection, caused by local activation of coagulation, inhibition of fibrinolysis, and low levels of APC. Horizontal lines, median ± interquartile range from healthy (upper panels) and mechanically ventilated (lower panels) controls.

Subjects and methods A retrospective clinical study was performed in 454 consecutive patients who were admitted to our intensive critical care unit between 1 June and 1 October 2003. Patients whose platelet count decreased to less than 10 × 10⁴/µl during the course of a disease were regarded as being in a state of pre-DIC, and the relation between the subsequent rate of decrease of platelets (∆PLT) and the occurrence of DIC was investigated. ∆PLT was obtained as follows: (platelet count when platelets started to decrease − minimum platelet count subsequently observed) / (number of days from the start of the decrease to the minimum platelet count); a worked numerical sketch is given below. The platelet count decreased to less than 10 × 10⁴/µl in 41 out of 454 patients during their hospital stay. Among these 41 patients, 28 (70%) developed DIC. There was no significant difference in age, sex, APACHE II score, SOFA score, or frequency of systemic inflammatory response syndrome between DIC and non-DIC patients. When the platelet count, the International Normalised Ratio, the fibrinogen level, the fibrin/fibrinogen degradation product (FDP) value, and the most abnormal white blood cell count in non-DIC patients were compared with those observed at the time of DIC onset in DIC patients, only the FDP value was significantly higher in DIC patients (non-DIC vs DIC, 8.49 ± 3.40 µg/ml vs 59.6 ± 86.2 µg/ml; P < 0.05). ∆PLT was significantly higher in DIC patients ([6.55 ± 8.41] × 10⁴/µl/day) than in non-DIC patients ([2.24 ± 1.87] × 10⁴/µl/day). Patients who had less than 10 × 10⁴/µl platelets and whose platelets decreased by 6.55 × 10⁴/µl/day on average had a strong likelihood of developing DIC.

Background The family of natriuretic peptides comprises several structurally related 22-53 amino acid peptides, such as atrial natriuretic peptide (ANP) and brain natriuretic peptide (BNP), which are vasoactive peptides that have vasodilator and diuretic properties and play an important role in cardiovascular homeostasis. The salutary cardiovascular effects of natriuretic peptides suggest that ANP and BNP have a pathophysiological significance in cardiac depression in septic patients. The aim of the present study was to determine plasma levels of the stable N-terminal prohormone forms of ANP (NT-proANP) and BNP (NT-proBNP) in septic patients treated with and without Drotrecogin alfa (activated). Troponin I (TNI), as a parameter of cardiac dysfunction, was also evaluated. Patients and methods NT-proANP, NT-proBNP and TNI levels were measured by ELISA methods in plasma samples from 40 septic patients at day 1 of severe sepsis and from 25 septic patients treated with Drotrecogin alfa (activated). Statistical analysis was performed with the unpaired Student's t test.
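Returning to the pre-DIC analysis above, the ∆PLT definition reduces to a simple slope. A minimal sketch with hypothetical platelet counts (in units of 10⁴/µl):

```python
def delta_plt(count_at_onset: float, minimum_count: float,
              days_to_minimum: float) -> float:
    """∆PLT = (count at onset of decrease − subsequent minimum count)
    / (days from onset to the minimum), in 10⁴/µl/day."""
    return (count_at_onset - minimum_count) / days_to_minimum

# Hypothetical course: 9.8 at onset, nadir of 3.2 two days later
print(delta_plt(9.8, 3.2, 2.0))  # 3.3 (× 10⁴/µl/day)
```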
The mean SAPS II scores on day 1 of septic patients treated with and without Drotrecogin alfa (activated) were comparable (mean SAPS II score = 32). Significantly lower concentrations of NT-proBNP (P = 0.0075) and NT-proANP (P = 0.0366) were measured in the patients treated with Drotrecogin alfa (activated) as compared with the patients not receiving Drotrecogin alfa (activated). TNI values were also significantly lower (P < 0.05) in septic patients treated with Drotrecogin alfa (activated) (mean ± SD = 0.3 ± 0.97 ng/ml) as compared with untreated patients (mean ± SD = 3.3 ± 7.2 ng/ml). There was a correlation between NT-proBNP and NT-proANP values (r = 0.6381, P < 0.001). Septic patients with NT-proBNP levels > 1600 fmol/ml have a 3.7-fold higher risk of death than patients with NT-proBNP levels < 1600 fmol/ml (P = 0.05). Conclusion NT-proANP and NT-proBNP levels are significantly lower in septic patients treated with Drotrecogin alfa (activated). NT-proBNP can serve as a predictor of survival in septic patients. Drotrecogin alfa (activated) may influence cardiac depression in septic patients by controlling the pathways of NT-proANP and NT-proBNP production. The beneficial effect of Drotrecogin alfa (activated) on cardiac function is also reflected by the lower TNI values in this patient group.

Methods Inclusion/exclusion criteria were similar to PROWESS. Patients eligible for participation had a known or suspected infection, met three or four criteria defining the systemic inflammatory response syndrome, and had one or more acute sepsis-induced (< 48 hour duration) organ dysfunctions. Patients were classified by the time interval from the first documented organ dysfunction to administration of DrotAA (time-to-treatment): patients who had received DrotAA within 24 hours (n = 1128) versus > 24 hours (n = 1246) after the first documented organ dysfunction. Results Figure 1 shows overall Kaplan-Meier survival curves for PROWESS and ENHANCE (28-day percent mortality shown). Survival curves for patients receiving DrotAA were essentially identical for PROWESS and ENHANCE through 28 days. These results were maintained in the patient subgroups with a baseline APACHE II score of 25 or more (23.0%, n = 430 vs 27.4%, n = 432) and with two or more organ dysfunctions (25.2%, n = 894 vs 28.6%, n = 1110). Similar results were observed in the MOD subgroup as in the overall population. ENHANCE MOD patients tended to have lower APACHE scores but greater organ support at baseline compared with PROWESS patients. Mortality was similar for DrotAA-treated patients in ENHANCE and PROWESS, and both were lower than for PROWESS placebo patients. Serious bleeding was numerically higher in ENHANCE than in PROWESS. Patients treated within 24 hours of the first organ dysfunction had lower 28-day mortality (25.2%, n = 894) than those treated after 24 hours (28.6%, n = 1110). Conclusions Results from the global ENHANCE trial reinforce the benefit-risk profile of DrotAA observed in the PROWESS MOD subgroup. In addition, earlier treatment (≤ 24 hours) appears to be associated with lower mortality.

Background A global, single-arm, open-label trial of Drotrecogin alfa (activated) (DrotAA) in adult patients with severe sepsis was subsequently conducted at 361 sites in 25 countries (ENHANCE, n = 2378). Reported here are 28-day all-cause mortality and safety data. Methods Inclusion/exclusion criteria and the definition of serious adverse events (SAE) were similar to PROWESS. Patients at high risk of bleeding were excluded from participation.
SAE bleeding rates, including intracranial hemorrhage (ICH), during the study drug infusion period (infusion period + 1 day) were determined, and all-cause mortality was assessed at day 28. Serious bleeding rates and fatal event rates for PROWESS and ENHANCE are presented in Table 1. Overall 28-day mortality: PROWESS placebo = 30.8%, PROWESS DrotAA = 24.7%, ENHANCE DrotAA = 25.3%. As in PROWESS, bleeding was the most common drug-related complication associated with DrotAA. A total of 48.9% of SAE bleeding events were adjudicated as procedure-related. In ENHANCE, the ICH rate during infusion was 0.6% (n = 15), of which five (0.2%) were fatal. ICH rates for placebo vs DrotAA in PROWESS were 0% and 0.2% (n = 2, both fatal), respectively. A higher postinfusion rate of serious bleeding was observed in ENHANCE, suggesting a higher overall background bleeding rate relative to PROWESS. Conclusions Twenty-eight-day survival was similar in PROWESS and ENHANCE. Serious bleeding events during infusion of DrotAA were slightly more frequent in ENHANCE than in PROWESS; the proportion of fatal bleeding events was the same. However, the background bleeding rate in ENHANCE may have been higher. The ENHANCE safety and efficacy data are highly consistent with PROWESS, and reinforce the favorable benefit-risk profile of DrotAA in patients with severe sepsis. Acknowledgement This research was supported by Eli Lilly and Company, Indianapolis, IN, USA.

Methods Inclusion and exclusion criteria were similar to PROWESS. Patients eligible for participation had a known or suspected infection, met three or four criteria defining the systemic inflammatory response syndrome, and had one or more acute sepsis-induced (< 48 hour duration) organ dysfunctions. Days in the ICU, days in the hospital, and days of ventilator use were monitored, starting from the time of DrotAA infusion. Patients were classified with respect to the time interval from the first documented organ dysfunction to administration of DrotAA, in the intent-to-treat population with two or more organ dysfunctions at baseline, as having received DrotAA within 24 hours after the first organ dysfunction (time 0-24 hours, n = 894) vs more than 24 hours after the first organ dysfunction (time 24-48 hours, n = 1110). Results Table 1 describes patients with two or more organ dysfunctions at baseline. Administration of DrotAA within 24 hours of the first organ dysfunction was associated with reductions in days in the ICU, hospital length of stay, and mechanical ventilator use, relative to patients receiving DrotAA from 24-48 hours after the first organ dysfunction. Conclusion Earlier treatment with DrotAA was associated with reduced hospital resource use and decreased length of stay in patients with two or more organ dysfunctions at baseline. These data suggest that the timely administration of DrotAA may have important clinical and economic value. Acknowledgement This research was supported by Eli Lilly and Company, Indianapolis, IN, USA.

Methods Inclusion and exclusion criteria were similar to PROWESS. Patients eligible for participation had a known or suspected infection, three or four of the criteria defining the systemic inflammatory response syndrome, and one or more acute sepsis-induced (< 48 hour duration) organ dysfunctions. Days in the ICU, days in the hospital, days of ventilator use, and days of vasopressor use were monitored, starting from the time of DrotAA infusion.
Patients were subclassified with respect to the time interval from the first documented organ dysfunction to administration of DrotAA (time-to-treatment). Time-to-treatment was considered in the intent-to-treat population with an APACHE II score of 25 or more at baseline, comparing those who had received DrotAA within 24 hours of the first organ dysfunction (n = 430) vs those who received the drug more than 24 hours after the first organ dysfunction (n = 432). Results Data presented are from severe sepsis patients with an APACHE II score of 25 or more at baseline. The administration of DrotAA within 24 hours of the first observed organ dysfunction was associated with significant reductions in days in the ICU (median ICU days = 10 for time 0-24 hours, 14 for time 24-48 hours; P = 0.006), mechanical ventilator use (median ventilator days = 7 for time 0-24 hours, 10 for time 24-48 hours; P < 0.001), and vasopressor use (median vasopressor days = 2 for time 0-24 hours, 3 for time 24-48 hours; P = 0.003), relative to patients receiving DrotAA more than 24 hours after the first organ dysfunction. Median days in the hospital were lower in patients receiving DrotAA within 24 hours of the first organ dysfunction, but the difference was not statistically significant (median hospital days = 18.5 for time 0-24 hours, 22 for time 24-48 hours; P = 0.103). Conclusion Earlier treatment with DrotAA was associated with reduced hospital resource use in patients with an APACHE II score of 25 or more at baseline. These data suggest that the timely administration of DrotAA may have important clinical and economic value.

Introduction It has been shown that age and the number of organ dysfunctions (OD) correlate with the outcome of patients suffering from severe sepsis (SS). Recently, the PROWESS study (a phase III, double-blind, placebo-controlled trial) demonstrated that recombinant human activated protein C (rhAPC) reduces the absolute risk of 28-day all-cause mortality by 6.1% in patients suffering from SS. As a result, in Mexico 206 patients were treated with rhAPC from August 2002 until October 2003. The aim of the present study was to identify the main variables that contribute to early mortality (first 8 days) in patients treated with rhAPC. Methods Data from 206 patients treated with rhAPC in 58 public and private medical institutions all over Mexico were analyzed retrospectively. Age, origin of SS, number of OD, timing of rhAPC administration, and complications during rhAPC were used as independent variables. Cross-tabulation and chi-square analysis were performed to identify the variables that contribute to mortality at 8 days of SS. Results Intra-abdominal sepsis was the most common origin of SS (46.6%), followed by postoperative sepsis, pneumonia and others (28.6%, 19.4% and 5.3%, respectively). There were no differences in mortality between patients treated in public hospitals and patients treated in private hospitals. The mean age of patients treated was 58.80 years (range 19-89 years). The mean number of OD was 2.78 (range 1-6), the mean time to rhAPC administration was 50.94 hours (range 4-504 hours), and global mortality at 1 week was 31.6%. Age was associated with mortality: patients younger than 41 years had 14.7% mortality, in contrast with 48.3% mortality in patients older than 65 years (P < 0.001). The number of OD was associated with mortality: patients with one OD had 21.4% mortality, in contrast with 37.7% mortality in patients with three or more OD (P = 0.040).
Patients younger than 41 years with one OD had 0% mortality, with two OD 11.8%, and with three or more OD 21.4%; patients between 40 and 65 years with one OD had 25% mortality, with two OD 20.0%, and with three OD 33.3%. Patients older than 65 years with one OD had 25% mortality, with two OD 50%, and with three or more OD 50% (P = 0.003). Bleeding was observed in 4.7% of patients, and one death from cerebral haemorrhage associated with rhAPC (0.48%) was reported. Conclusions Age and the number of OD are highly associated with mortality at 1 week of SS in patients treated with rhAPC. Patients suffering from SS have to be treated at an early stage of sepsis.

Introduction Neuraxial anaesthesia reduces the risk of postoperative complications, but in high-risk patients undergoing major abdominal surgery the benefits of neuraxial anaesthesia are less clear [1,2]. A significant proportion of patients develop sepsis following laparotomy. Recently, Drotrecogin alfa (activated) has been shown to reduce the risk of death from sepsis [3]. Due to the increased risk of bleeding, the presence of a neuraxial catheter is an absolute contraindication to its use. We hypothesized that the presence of a neuraxial catheter excludes a significant number of septic patients from receiving this potentially beneficial therapy. Method All patients admitted to our unit over a 1-year period following major intra-abdominal surgery were reviewed. Among those with neuraxial catheters, data were collected on systemic inflammatory response syndrome (SIRS) criteria, presence of infection, organ failure and hospital survival. Sepsis was defined according to the Critical Care Medicine Consensus Committee. Results See Table 1. Sixty-seven patients were admitted following major abdominal surgery. Of these, 57 (85%) had neuraxial catheters, of whom 14 (25%) developed sepsis. The mortality in this group was 8/14 (57%). In our unit, patients receiving Drotrecogin alfa had a mortality of 29% (expected 53%). The mortality in patients with sepsis in our group undergoing major abdominal surgery was comparable with that observed in the PROWESS study (0.57 vs 0.52) [3]. Given that Drotrecogin alfa has been shown to reduce the risk of death by 6% and that neuraxial anaesthesia confers little survival benefit following major abdominal surgery, avoiding the use, or ensuring the early removal, of catheters in sepsis may confer a significant increase in survival.

Methods The Council of the SICS agreed that an audit of the use of Xigris® be conducted once it was licensed. The Scottish Intensive Care Society Audit Group (SICSAG) implemented this study on behalf of the SICS. Information manuals containing the guidelines and the data collection methodology were introduced throughout Scottish ICUs. The guidelines were also made available in an interactive format on the ICU audit system (Ward Watcher, Critical Care Audit Ltd, Yorkshire, UK) and provided clinicians with an electronic tool to help determine fulfilment of the criteria. Pharmacists and ICU staff informed the SICSAG whenever the drug was administered. A nurse from the audit group attempted to review the records of all patients receiving the drug in the first 10 months. Conclusion Early use of this drug has been stable month-on-month. Overall use has been less than anticipated but has shown marked variation between units.
Consultants generally follow those aspects of the guidelines relating to SIRS criteria and organ dysfunction, but seem reluctant to base prescribing decisions on the APACHE score.

Method Fifteen patients who received Drotrecogin alfa were each matched for ICNARC diagnostic category and APACHE II score with two historical controls, selected from patients admitted prior to the availability of Drotrecogin alfa. Following successful matching, the mortality was unblinded. The two groups (control and Drotrecogin alfa) had comparable APACHE II scores (25.5 vs 25.7) and risks of death at admission (0.52 vs 0.54), respectively [2]. The mortality was 60% for historical controls and 27% for those receiving Drotrecogin alfa. The standardised mortality ratios (observed/expected deaths) were 1.14 and 0.49, respectively (chi-squared P < 0.05). Conclusion Protocol-driven Drotrecogin alfa treatment in septic patients appears to confer a survival benefit in a UK district general intensive care unit. The absolute mortality reduction of 33% was considerably higher than that in PROWESS [1].

Objectives To assess the epidemiology and prognosis of thrombocytopenia (TP) in ICU patients. Design A monocentric retrospective analysis of data prospectively collected in a databank. Setting A 10-bed adult medical ICU in a university hospital. Patients Three hundred and twenty-five consecutive ICU patients admitted for > 24 hours between January and December 2002 were included. Measurements and main results TP was defined by two consecutive platelet counts < 150 G/l. The main outcome measure was ICU mortality. The main reasons for ICU admission of TP patients were acute respiratory failure (n = 37), shock (n = 21), sepsis (n = 18), coma (n = 13), acute renal failure (n = 12), metabolic disorders (n = 9), scheduled surgery (n = 5), acute poisoning (n = 4), trauma (n = 3), and miscellaneous causes (n = 4). In the overall population, the mean platelet count upon admission was 314 ± 114 G/l. TP was observed in 126/325 (39%) and was present on admission in 89 patients, with a mean count of 97 ± 340 G/l. ICU-acquired TP occurred with a median delay of 4.7 days (range 2-26 days) in 37/126. ICU mortality was 29% (n = 37) in the TP population and 9% (n = 19) in the non-TP population. Bone marrow aspiration was performed in 35/126 TP patients. Before TP, a previous red blood cell (RBC) transfusion was recorded in seven cases, a previous plasma infusion in two cases, and prior exposure to heparin in 25 cases. The diagnosis of heparin-induced TP was excluded in 17/25 cases, unlikely in three cases, possible in three cases and very likely in two cases. TP was related to sepsis in 51 cases, to disseminated intravascular coagulation in 28 cases (among these, 14 were associated with sepsis), to central megakaryocyte depletion in five cases, and to hemophagocytic histiocytosis in four cases. A combination of two or more of these mechanisms was observed in 23 cases. In 57 patients, the etiology of TP was undetermined or related to another cause. In univariate analysis, TP was significantly associated (all P < 0.01) with a higher SAPS II score (50 ± 22 vs 40 ± 20), a higher SOFA score (8.6 ± 4.3 vs 4.7 ± 4), need for renal replacement therapy (39% vs 20%), need for invasive mechanical ventilation (51% vs 36%), longer length of stay (11 ± 13 days vs 7 ± 6 days) and ICU mortality (29% vs 9%). No statistical difference was detected in patient characteristics between the two TP populations (TP present on admission or ICU acquired).
Twenty-eight TP patients required platelet transfusions, and 81% of the 311 RBC concentrates and 85% of the 70 plasma concentrates administered during the study were infused to TP patients. The presence of TP indicates a subgroup of patients with high mortality, which accounts for most of the blood product-related costs in the ICU.

Objectives To study prospectively the incidence of complications and the financial impact of CVC placement in patients with BD without prior correction. Methods A total of 203 CVCs were placed in 121 patients during a 32-month period. BD were defined as a prothrombin time (PT) of less than 50%, and/or an activated partial thromboplastin time (aPTT) of 50 s or more (normal range 35-45 s), and/or a platelet count (PC) < 100,000/mm³. External and internal bleeding were considered CVC placement-related complications. The internal jugular vein, subclavian vein and femoral vein were the chosen insertion sites. Procedure-related transfusion requirements, surgical corrections or lengthening of hospital stay were defined as major complications. Fresh frozen plasma (FFP) and platelet units (PTL) required for correction of BD were estimated by standard formulas. The mean age was 47.7 years (range 14-74 years) with a male/female ratio of 1.7. The internal jugular vein was the insertion site in 80.3% of patients. The APACHE II score was 16.5 ± 7.8 (range 2-40). Three operators performed 91.6% of the procedures. There was no mortality and there were no major complications associated with CVC placement; 10 patients showed local hematomas. Results for PT (%) were 50 ± 24.8 (1-114), for aPTT (s) 53.6 ± 29.9 (16-180) and for PC (/mm³) 115,571 ± 89,516 (5000-475,000) (mean ± SD). The average PC in patients with thrombocytopenia was 56,133/mm³. FFP and PTL units saved, and local and US charges saved according to published payment rates, are depicted in Table 1. The need for coagulopathy correction and the levels of coagulation tests considered safe for CVC placement are still a matter of debate. In our study population, omitting previous correction was not shown to impose an increased risk.

Background Thrombocytopenia may be a marker of severe illness or a predictor of poor outcome in critically ill medical-surgical ICU patients. Objective To estimate the prevalence, incidence, risk factors for, and consequences of thrombocytopenia (defined as platelets < 150 × 10⁹/l). Heparin-induced thrombocytopenia tests (serotonin release assay) were performed; none were positive. Patients who ever developed thrombocytopenia versus those who did not had a longer ICU stay (12 days vs 8 days), but a similar hospital stay (30 days vs 23 days), ICU mortality (31% vs 24%), and hospital mortality (46% vs 34%). A platelet count < 150 × 10⁹/l was not independently predictive of mortality after adjusting for age and illness severity (HR = 1.0 [0.6-1.7]), whereas a platelet count < 50 × 10⁹/l was (HR = 8.3 [3.8-18.3]). Conclusions In medical-surgical ICU patients, thrombocytopenia is common and associated with an increased duration of ICU stay, but not with an increased risk of mortality until the platelet count is < 50 × 10⁹/l.

Setting The trauma database at Sunnybrook and Women's College Health Sciences Centre, University of Toronto, Toronto, Canada, a regional, level one trauma centre.
Methods Two hundred and seventy-five consecutive patients receiving 10 or more units of packed red blood cells in the first 24 hours of admission to the hospital, from January 1992 to December 2001, were reviewed from the trauma database. The criteria used to determine the appropriateness of transfused blood products in the first 48 hours of admission were those of the American Society of Anesthesiologists [1]: hemoglobin > 80 g/l, platelet count > 50 × 10⁹/l, International Normalised Ratio (INR) < 1.5, fibrinogen > 1.0 g/l. Inappropriately transfused units of blood products were counted as those blood units administered despite the last laboratory reading meeting the aforementioned criteria. Each assessment period and clinical action extended from 15 min after the last laboratory data until 15 min after the next laboratory data. Data collection Laboratory data (hemoglobin value, platelet count, INR, fibrinogen level) and transfusion data (packed RBC, PLT, FFP and CRY) in the first 48 hours of admission were collected. Results See Table 1. Conclusions There may be substantial inappropriate blood product transfusion in the treatment and resuscitation of trauma patients. However, it is also possible that routine laboratory measurements are not sufficiently sensitive indicators of appropriateness for this patient population; that is, patients may frequently have active bleeding without the opportunity for laboratory samples to be obtained. The potential causes of the former include lack of knowledge of the existing guidelines, a decision not to follow the guidelines, or overcorrection of coagulopathy. Possible solutions include: redistribution of guidelines to the various departments of a hospital (operating room, emergency room, critical care); cell saver techniques; accepting lower blood pressures in select patients; use of antifibrinolytics; and use of factor VIIa. Regrettably, the trauma patient population is a heterogeneous group and, as in therapeutic trials in sepsis, represents a group with an extremely high morbidity and mortality that can complicate the interpretation of data.

Bovine hemoglobin represents an interesting alternative because of its availability in large quantities. A 500-kg steer has approximately 35 l of blood with 12 g/dl hemoglobin (total-body hemoglobin content 4.2 kg), and bovine blood is available as an unlimited supply. Hemopure, a bovine product, is approved for veterinary use in the United States, and for human use in South Africa. Multicenter, randomized, phase III, controlled trials have demonstrated the safety and efficacy of HEMOPURE® (Biopure Corporation), also known as HBOC-201. The product is ultrapurified to remove any plasma proteins, RBC stroma, and potential pathogenic material. During the manufacturing process, cross-linking and polymerization stabilize the hemoglobin molecule, which increases its vascular persistence as well as the efficiency of oxygen transport to tissue.

The mean cost per patient using the rhEPO strategy was $1030 vs $1323 for no rhEPO. Fewer units of PRBCs were administered in the rhEPO arm, resulting in a lower infection rate. rhEPO use was therefore both less expensive and more effective (i.e. it was the dominant strategy). One-way sensitivity analyses revealed that administration of rhEPO remained the preferred strategy provided that the PRBC cost was greater than $186/U or that the rhEPO cost was less than $1455/40,000 U (an illustrative sketch of this threshold comparison follows below).
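The threshold logic of this comparison can be sketched as a two-strategy expected-cost model. The structure below is only illustrative: the expected transfusion counts and cost values are placeholders, not the study's inputs, and the infection-cost term of the published model is omitted for brevity.

```python
def strategy_cost(drug_cost: float, expected_prbc_units: float,
                  cost_per_prbc_unit: float) -> float:
    """Expected per-patient cost: drug acquisition plus transfusion costs."""
    return drug_cost + expected_prbc_units * cost_per_prbc_unit

cost_per_unit = 200.0   # $/U PRBC (placeholder)
rhepo_cost = 450.0      # $/patient for rhEPO (placeholder)
units_without = 4.0     # expected PRBC units without rhEPO (placeholder)
units_with = 1.5        # expected PRBC units with rhEPO (placeholder)

no_rhepo = strategy_cost(0.0, units_without, cost_per_unit)
with_rhepo = strategy_cost(rhepo_cost, units_with, cost_per_unit)
print(f"rhEPO dominant: {with_rhepo < no_rhepo}")  # True for these inputs
```

In this simplified form, the drug pays for itself whenever its cost is below the product of the units saved and the unit cost, which is the shape of the 2.4 × PRBC-unit-cost threshold generalized in the next paragraph.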
These results can be generalized: provided that the cost of rhEPO is less than 2.4 × the cost of 1 U PRBC, the rhEPO strategy will be preferable. Further, the advantage of treatment with rhEPO was maintained down to an expected transfusion rate of 1.1 U PRBC per patient per admission. Decreasing the probability of infection or increasing the long-term cost of infection did not significantly affect the model. The management of chronic anemia using rhEPO among ICU patients admitted > 48 hours appears to be cost-effective, provided that the rhEPO cost is less than 2.4 times the cost of 1 U PRBC. This effect is retained at low expected transfusion rates and with changes in the cost or risk of infection arising from blood administration.

Results Functional capillary density was significantly higher in animals that received HES (994/mm² vs NaCl 826/mm², P ≤ 0.001), whereas erythrocyte velocity was not altered considerably in either group. Histologic grading revealed severe mucosal atrophy in the NaCl group but not in the HES group (P < 0.001). The number of rolling leukocytes was notably higher in NaCl animals as compared with HES animals (50/min vs 26/min, P ≤ 0.05). The results show beneficial effects of HES on microvascular perfusion and mucosal cellular integrity of the gut. A possible mechanism may be the inhibition of the leukocyte-endothelium interaction in the mesenteric circulation.

Objective The aim of this study was to investigate the effect of fluid resuscitation with 4% modified fluid gelatine (GEL) versus 6% hydroxyethyl starch (HES) on extravascular lung water (EVLW) and oxygenation in patients with septic shock and acute lung injury (ALI). Design A prospective randomized double-blinded clinical trial. Setting A 20-bed intensive care unit in a university hospital. Patients Thirty hypovolemic patients (intrathoracic blood volume index [ITBVI] < 850 ml/m²) in septic shock with ALI were randomized into HES and GEL groups (15 patients each). Interventions For fluid resuscitation, 250 ml/15 min boluses (maximum 1000 ml) were given until the end point of ITBVI > 900 ml/m² was reached. Repeated haemodynamic measurements were performed at baseline (tb), at the end point (tep), and then at 30 and 60 min after the end point was reached (t30, t60). The cardiac output, stroke volume, EVLW, and oxygen delivery were determined at each assessment point. For statistical analysis, two-way ANOVA was used. The ITBVI, cardiac index, stroke volume index, and oxygen delivery index increased significantly at tep and remained elevated at t30 and t60, but there was no significant difference between the two groups. EVLW, although moderately elevated, remained unchanged, and there was no significant difference between the groups.

Introduction Acute hemorrhagic necrotizing pancreatitis (AP) remains a major problem with a high mortality. Disturbed microcirculation and, as a consequence, decreased tissue oxygenation are crucial in the cascade from the self-limiting mild edematous form to the often fatal form of AP. The aim of the study was to evaluate the therapeutic approach of isovolemic hemodilution (IHD) with hydroxyethyl starch (HES) alone and in combination with HBOC-301, and with Ringer's solution, with regard to pancreatic tissue oxygen tensions and survival in pigs suffering from acute pancreatitis. Methods After approval of the local ethics committee, 39 pigs were anesthetized, endotracheally intubated and normoventilated (FiO₂ = 0.3). After laparotomy and splenectomy, the pancreatic duct was cannulated.
The pancreatic tissue oxygen tension (tpO₂) was measured using a silicon catheter (Licox Medical Systems, GMS, Germany). After 30 min of equilibration, AP was induced by a combination of intravenous cerulein and intraductal glycodeoxycholic acid. Fifteen and 75 min after induction of AP, animals were randomized and isovolemically (PAOP constant) hemodiluted with 10% HES 200,000/0.5 plus HBOC-301 (+ 0.6 g/dl plasma hemoglobin; Oxyglobin®, Biopure, USA) (HES/HBOC group), 10% HES 200,000/0.5 (HES group) or Ringer's solution (RINGER group) to a hematocrit (Hct) of 15%. tpO₂ was measured every hour, beginning 30 min after the Hct of 15% was reached. After 6 hours the catheters were removed, the abdomen was closed and the animals were extubated. After 6 days (144 hours) the animals were sacrificed. Statistical analyses were performed using the Kolmogorov-Smirnov test for normal distribution, Student's t test for normally distributed data, and the chi-square test for the survival rate (significance, P < 0.05). Results After induction of AP, tpO₂ decreased in all groups, and increased significantly after IHD with HES/HBOC in comparison with the HES and RINGER groups (P < 0.001). tpO₂ remained low in the HES and RINGER groups, and decreased further after 5 hours (P = 0.046) and 6 hours (P = 0.015) in the RINGER group in comparison with the HES group. The survival rate at the end of the observation period was higher in the HES/HBOC group (10/13; P = 0.002) in comparison with the RINGER group (2/13; P < 0.001), and in the HES group (8/13) versus the RINGER group (P = 0.016). Only IHD with HES plus HBOC-301 was able to normalize pancreatic tissue oxygen tensions, in comparison with IHD with HES or Ringer's solution, after induction of severe AP. The complementary strategy of IHD with additional application of HBOC-301, as a combination of rheologic and O₂-delivering therapy, may represent a novel therapeutic option for the treatment of acute pancreatitis in the future.

Objectives Patients with cirrhosis and tense ascites treated with paracentesis require repeated intravascular fluid loading with colloid solutions. In such patients, human albumin has been suggested to be superior to a synthetic colloid for fluid loading, albeit at a higher cost. Thus, we started a 6-month trial aiming to compare the outcomes in patients with cirrhosis treated with albumin with the outcomes in those treated with bovine-derived polygeline. We also aimed to investigate hospital costs. This trial had to be prematurely interrupted due to theoretical concerns regarding the safety of bovine-derived products. Methods A multicenter, randomized, double-blind trial was conducted in patients with tense ascites who were randomly assigned to receive intravenously either 20% human albumin or polygeline. The prespecified primary end point was the time to a first liver-related event (a composite of death, episodes of recurrent tense ascites, renal impairment, hyponatremia, bacterial infection, encephalopathy, portal hypertensive bleeding, the occurrence of hepatocellular carcinoma, and liver transplantation) during the in-trial period (i.e. the period during which patients had only received the assigned colloid). All events were blindly adjudicated. Results Thirty patients were assigned to albumin and 38 to polygeline. At baseline, all patients had therapeutic paracentesis.
The duration of the in-trial period was longer in the albumin group than in the polygeline group (mean ± SD, 121 ± 64 days vs 77 ± 58 days; P = 0.004), due to lower frequencies of unblinded colloid infusions and of dropout in patients assigned to albumin. There were 24 liver-related first events in the albumin group (58.9 per 100 person-months) and 34 liver-related first events in the polygeline group (107.4 per 100 person-months); the hazard ratio for a first liver-related event with albumin therapy was 0.51 (95% confidence interval, 0.30-0.88; P = 0.016). The median time to the first event was longer in the albumin group than in the polygeline group (20 days [95% confidence interval, 11-46 days] vs 7 days [95% confidence interval, 6-13 days]; P = 0.044). The median total hospital cost was lower in the albumin group than in the polygeline group (€1915 vs €4612 per 30-day period; P = 0.004). In patients with cirrhosis and tense ascites treated with paracentesis, human albumin is more effective than polygeline in preventing the occurrence of a first event related to liver disease. This beneficial effect seems to be associated with a decrease in hospitalization-related cost.

Background Albumin with furosemide has been reported to improve cardiovascular stability and oxygenation in adult respiratory distress syndrome (ARDS) [1]. Method Fourteen patients meeting the ARDS criteria of a PO₂/FiO₂ ratio < 27 kPa (200 mmHg) plus bilateral X-ray shadowing without cardiac cause were recruited at the Chelsea and Westminster Hospital, London. Oxygenation and cardiovascular parameters were measured before (t0) and 5 min (t5) and 240 min (t240) after rapid intravenous administration of 200 ml of 20% human albumin solution (BPL) with 20 mg furosemide. Results were analysed by ANOVA with Bonferroni/Dunn post hoc significance testing, with P < 0.05 considered significant. Albumin concentrations were corrected for changes in plasma volume. Values are expressed as mean (SD). Both colloid osmotic pressure (COP) and albumin changed significantly, with a significant immediate elevation at t5 that declined by t240, although the elevation above baseline remained significant. Plasma albumin was 13 (3.8) g/l at t0, 17.4 (6.1) g/l at t5, and 16.7 (4.7) g/l at t240. COP was 16.1 (2) mmHg at t0, 19 (2.5) mmHg at t5 and 17.3 (2.4) mmHg at t240. Significant changes also occurred in both PO₂/FiO₂ and central venous pressure (CVP), but the significant increases at t5 were not sustained at t240. PO₂/FiO₂ was 19.9 (4.1) kPa at t0, 22.3 (5.8) kPa at t5, and 20.6 (6) kPa at t240. CVP was 17 (6) mmHg at t0, 20 (5) mmHg at t5, and 18 (5) mmHg at t240. The heart rate and cardiac output were not significantly affected by albumin administration, although an increase in cardiac output correlated nonsignificantly with improvement in the PO₂/FiO₂ ratio at t5 (correlation 0.43) and at t240 (correlation 0.40). Discussion Cardiovascular stability was unaffected by albumin and furosemide administration. An immediate improvement in oxygenation was not sustained at 4 hours.

Introduction Recombinant factor VIIa (rFVIIa) is a prohaemostatic agent that has successfully been used for the treatment of patients with haematological malignancies and coagulopathy. It has also been used in postsurgical patients, in whom coagulopathy in combination with hypothermia and metabolic acidosis increases early postoperative mortality.
The use of rFVIIa has also recently been reported in patients with a normal coagulation system undergoing major surgery, in order to reduce intraoperative bleeding and the need for transfusion. Here we present our experience of rFVIIa use in ICU patients. In the period from January 2002 to December 2003, 16 ICU patients received recombinant factor VIIa (10 men and six women, with a mean age of 62.62 years). Of these 16 patients, 10 were admitted to the ICU postoperatively, and the reason for rFVIIa administration was severe bleeding despite surgical intervention and massive blood component transfusion. rFVIIa was administered to six patients with haematological malignancies: in five of them to correct coagulation disorders, and in one patient with coagulopathy to prevent bleeding before an invasive therapeutic procedure. These patients were infused with 15-90 µg/kg body weight rFVIIa. To evaluate the treatment, the prothrombin time, the activated partial thromboplastin time (aPTT) and the fibrinogen level were measured, as well as the number of blood components transfused before and after rFVIIa administration. The success of rFVIIa treatment was confirmed by clinical stabilization and outcome. Results See Table 1 for the mean values before and after rFVIIa infusion. No adverse effects were observed. Bleeding was controlled in 14 patients (87.5%). Of these 14 patients, five (35.71%) died, the cause being multiple organ dysfunction. Two patients (12.5%) died of continuing bleeding despite high doses of rFVIIa. These results show that after treatment with rFVIIa, coagulation times and blood component requirements all decreased. The use of rFVIIa in ICU patients is a safe and effective method to control life-threatening bleeding that cannot be managed surgically. Methods Two hundred and forty-five cirrhotic patients (Child-Pugh < 13; Child-Pugh A = 20%, B = 52%, C = 28%) with UGIB (variceal = 66%, nonvariceal = 34%) were equally randomized to receive eight doses of 100 µg/kg rFVIIa or placebo in addition to pharmacologic and endoscopic treatment. The primary composite endpoint was failure to control UGIB within 24 hours postdosing, failure to prevent rebleeding between 24 hours and day 5, or death within 5 days. Results Baseline characteristics were similar between the two groups. Results of the primary and exploratory analyses are presented in Table 1. In the subgroup of Child-Pugh B and C variceal bleeders, significantly fewer patients in the rFVIIa-treated group failed on the primary composite endpoint (P = 0.03) and the 24-hour bleeding-control endpoint (P = 0.01) compared with placebo. rFVIIa did not improve the efficacy of standard treatment in Child-Pugh A cirrhotic patients, and no significant effect was found when analysing all applicable patients. The incidences of adverse events, including thromboembolic events, were similar, and there were no significant differences in 5-day or 42-day mortality. Conclusions Exploratory analyses in Child-Pugh B and C cirrhotic patients indicated that administration of rFVIIa was safe and significantly reduced the proportion of patients who failed to control variceal bleeding. Further studies are needed to verify these findings. Conclusion In accordance with other studies [2], the mortality rate of severely burned patients remains high despite considerable changes in burn care. We report the severity of illness on admission to the ICU and the extent of burns as contributors to the mortality rate.
Moreover, early excision of burns seems to be associated with a better outcome. Introduction This prospective study examines the mortality-predicting power of two different prehospital scoring systems in major trauma. We present an improved MEES in combination with capnometry: the MEES combined with capnometry (MEESc) is a new scoring system. In the prehospital setting, values of the MEES and capnometry (initial and final) were collected from each patient. We added the final values of the partial pressure of end-tidal CO2 (petCO2) to the MEES scoring system, scored from 0 to 2, so that the maximum sum of the scoring system becomes 30, with no change in the minimum score of 10. This study was undertaken over 3 years (January 2000-March 2003) and included 58 consecutive patients hospitalized for major trauma (defined as Injury Severity Score > 15) requiring intubation at the roadside and in whom the prehospital petCO2 had been recorded. Patients younger than 16 years old were excluded from the study. There were 48 males and 10 females. Methods Sixteen patients with SCH-B in the middle and end stages underwent MARS therapy as the MARS treatment group. Sixteen patients chosen retrospectively from previous therapy data, also with SCH-B and clinically matched to the MARS group, served as the standard medical therapy (SMT) group; they had been treated with conventional methods without MARS. The prognoses of the two groups were studied, and the treatment effects on the levels of serum total bilirubin, nonconjugated bilirubin and mean arterial pressure (MAP) were observed in the MARS group during a single 6-8 hour MARS treatment. The single MARS treatment contributed to significant decreases in the levels of serum total bilirubin (from 547 ± 187 µmol/l to 312 ± 118 µmol/l, P < 0.05) and nonconjugated bilirubin (from 379 ± 134 µmol/l to 185 ± 68 µmol/l, P < 0.05), and MAP increased from 72 ± 8 mmHg to 81 ± 7 mmHg (P < 0.05) during the MARS therapy. There were statistical differences in prognosis and survival between the two groups: in the SMT group, six of 12 patients in the middle stage survived (50%), while in the MARS group the survival of comparable middle-stage patients was 10/12 (83.3%); in the end-stage subgroup comparison, all four patients in the SMT group died, whereas MARS therapy improved survival to 25% (1/4) in its four patients. The overall survival of these two groups of middle-stage and end-stage SCH-B patients was 37.5% for SMT and 68.5% for MARS (P < 0.05). Conclusions MARS therapy effectively removed serum bilirubin and other albumin-bound toxins, increased the MAP, and inhibited hepatocellular necrosis, thus contributing to organ protection. Results MARS treatments were associated with a significant reduction of albumin-bound toxins and various cytokines, as well as of water-soluble toxins. Most patients showed a positive response to the therapy, as shown by a significant increase in prothrombin time activity and mean arterial pressure, and a significant decrease in hepatic encephalopathy grade and Child-Turcotte-Pugh index. Eighty-four patients (56.4%) survived to hospital discharge, including those alive after liver transplantation. Survival of the acute liver failure, subacute liver failure and acute-on-chronic liver failure patients was 62.5%, 66.7% and 54.5%, respectively, and acute-on-chronic liver failure patients formed the majority (121/149, 81.2%).
Patients in the end stage formed the largest subgroup of acute-on-chronic liver failure (65/121, 53.7%), and patients in the early and middle stages gained more favorable outcomes than those in grade C (91.7% and 75% vs 30.8%) (Table 1). Randomized controlled studies are ongoing in China to verify the optimal therapeutic results of MARS. Materials Nineteen patients (nine male/10 female), 15-73 years of age, with sepsis who received continuous hemodiafiltration (CHDF) treatment were examined. The APACHE II score was 28.3 ± 0.4. Fifteen patients underwent mechanical lung ventilation, and 13 had inotropic support. The mortality rate was 43.9%. Prisma hemoprocessor kits with the AN69 membrane and solutions from the Hospal Company (France) were used for CHDF. The mean duration of CHDF was 73.8 ± 7.0 hours, with a filtration volume of 80.4 ± 1.6 l/day. Concentrations of TNF-α during CHDF were 386.3 ± 111.6-429.3 ± 80.7 pg/ml. A significant amount of this cytokine was measured in all effluent samples (90.9-132.5 pg/ml). Daily TNF-α elimination was 8.92 ± 0.94 µg. Clearance was 80.2 ± 6.3 l/day, which was one-third of the blood volume perfused through the hemodiafilter. We found a correlation between the volume of effluent and TNF-α elimination (r = +0.49 ± 0.04; n = 30; P < 0.05). The difference in TNF-α content in the plasma volume before and after the hemodiafilter, at a mean blood flow of 150 ml/min, hematocrit of 26.7% and plasma flow of 110 ml/min, was 17,160 pg/min. At the CHDF effluent flow of 55.6 ml/min, 6183 pg/min of TNF-α was eliminated. The difference between the TNF-α content in the plasma and effluent volumes was 10,977 pg/min, which indicates significant adsorption of this cytokine onto the hemodiafilter membrane during the first 6 hours of the CHDF procedure. Conclusion Significant amounts of TNF-α can be eliminated during CHDF by filtration and adsorption, which is important in the absence of natural hepatorenal clearance during multiple organ failure. The hemodynamic pattern of sepsis, characterized by a high cardiac index associated with low systemic vascular resistance and alterations in tissue oxygenation, is caused by the release of mediators during the interaction between the host and the infecting microorganism. We hypothesized that removing those endotoxins and inflammatory mediators by plasmapheresis would improve the hemodynamic changes and tissue oxygenation. After ethical approval for the trial was obtained, patients were prepared for plasmapheresis. The criterion for entry to the trial was sepsis as described by Bone and colleagues [1]. The severity of illness and mortality rates were classified and calculated using the APACHE II scale, MODS and SOFA. After obtaining the laboratory data and calculating the scores, hemodynamic assessment and tissue oxygenation calculations were performed using the thermodilution technique just before the first plasmapheresis. Plasmapheresis was performed twice, substituting fresh frozen plasma at a 1:1 ratio of the calculated plasma volume. The second plasmapheresis was performed after 48 hours. Hemodynamic assessment and tissue oxygenation calculations were repeated before the second plasmapheresis and at the end of the procedure (72 hours). The outcomes at 28 days were recorded. Statistical analysis was performed with SPSS version 11.0 for Windows (Chicago, IL, USA); values are expressed as mean ± SD, and P < 0.05 was considered statistically significant. We studied 10 patients treated in the ICU with plasmapheresis.
The difference between the mortality rate predicted from the initial APACHE II scores (52 ± 22.1%) and the observed 28-day mortality of all patients (20%) was significant (P < 0.05). The improvement in APACHE II (from 19.9 ± 6.6 to 12.6 ± 8.5), MODS (from 8.6 ± 3.6 to 5.2 ± 3.4) and SOFA (from 9.1 ± 2.4 to 5.5 ± 4.3) scores between baseline and the end of the procedure was statistically significant (P < 0.01). The changes in hemodynamic parameters were not significant (Table 1). Although the improvements in oxygen delivery, oxygen consumption and oxygen extraction ratio were not significant, the decrease in the plasma lactate level from before the first plasmapheresis (48.2 ± 29.3) to the end of the procedure (20.7 ± 16.2) was statistically significant (P < 0.007) and correlated with 28-day mortality. The change in plasma C-reactive protein from 124.5 ± 43.2 to 36.9 ± 20.4 was also significant (P < 0.05). As sepsis has been described as a 'malignant intravascular inflammation' caused by mediators, the hypothesis is that removing those mediators will improve the therapeutic effort. We suggest that plasmapheresis in the early stages of sepsis may be considered a therapeutic method, even during progressive sepsis complicated by disseminated intravascular coagulation. The limitations of endotoxin hemoadsorption therapy (PMX-DHP) and the optimal time to start PMX-DHP were examined in patients with septic multiple organ failure with hypercytokinemia (IL-6 > 1000 pg/ml). This study included 66 patients with infectious systemic inflammatory response syndrome in whom IL-6 was > 1000 pg/ml before PMX-DHP therapy. These subjects were separated into two groups: those who survived for more than 28 days after the start of PMX-DHP therapy (S group; 38 patients) and those who did not survive (N-S group; 28 patients). Severity of symptoms and background factors, hemodynamic parameters, PaO2/FiO2, endotoxin, cytokines (TNF-α, IL-6, IL-1ra) and vascular endothelial cell function-related markers (ICAM-1, ELAM-1, PAI-1) were examined before and after PMX-DHP. Statistical analyses were performed with the chi-squared test for background factors, Wilcoxon's signed rank test for comparisons within a group, and the Mann-Whitney U test for comparisons between groups. This study was approved by the IRB of Tokyo Medical University. The APACHE II score was 22.9 ± 6.3 and 30.7 ± 8.9, and the SOFA score was 9.7 ± 3.6 and 12.6 ± 3.4, in the S and N-S groups, respectively, both scores being significantly higher in the N-S group. The number of days that had elapsed from the onset of shock to PMX-DHP initiation was 0.8 ± 0.8 days in the S group, while it was longer (1.8 ± 1.2 days) in the N-S group. After PMX-DHP for 2 hours, the endotoxin level decreased from 26.1 ± 20.3 to 20.3 ± 24.5 pg/ml with statistical significance (P < 0.05) in the S group. In the N-S group, it tended to decrease from 23.6 ± 20.3 to 10.9 ± 17.2 pg/ml. These results suggest that PMX-DHP could save more lives in patients with septic multiple organ failure with IL-6 > 1000 pg/ml when applied early after the onset of shock. Objective Very high or sustained high levels of the inflammatory cytokines tumor necrosis factor (TNF) and interleukin (IL)-6 are believed to be responsible for adverse clinical effects in patients undergoing cardiopulmonary bypass (CPB). We explored, using a mathematical model, whether modulation of this response might be beneficial.
Methods We developed a mathematical model of the acute inflammatory response that was calibrated from rat endotoxemia and hemorrhagic shock data. The model accommodates a variety of initiators of acute inflammation, provides a dynamic profile of serum markers of inflammation over time, and expresses outcome as global tissue dysfunction. Irreversible dysfunction is interpreted as death. We constructed a population of 100,000 cases that differed by the level of initial stress and the propensity to mount an inflammatory response. The initial stress was chosen to result in 4% cohort mortality and to last less than 6 hours, as with CPB. The intervention consisted of the removal of circulating TNF, IL-6 and IL-10 over a period of 6 hours, during which stress was inflicted and acute inflammation triggered. We equated the degree of removal of cytokines to that observed with the application of a biocompatible adsorbent polymer hemoperfusion column in endotoxemic rats. Results Death correlated with cumulative serum IL-6 and, to a lesser degree, TNF levels. Patients with the highest levels of IL-6 6-24 hours after the insult are those that go on to die (Fig. 1). Examination of the results shows that, if IL-6 levels were decreased by > 60% and TNF levels by > 50% in the period at or shortly after CPB, over 99% of all patients would survive, compared with 96% in the control arm. Conclusions Global, nonspecific reduction of inflammation improves outcome in simulations of an acute inflammatory challenge such as CPB. Sepsis complicated by disseminated intravascular coagulation (DIC) is the most frequent cause of mortality in the ICU despite improvements in patient care [1]. Plasmapheresis is a tool to remove the endotoxins and inflammatory mediators responsible for its pathophysiology. We designed a study in a small group of septic patients with DIC to evaluate the changes in coagulation parameters, the severity of sepsis and the outcome after plasmapheresis. After ethical approval for the trial and written informed consent were obtained, patients were prepared for plasmapheresis. The criteria for entry to the trial were sepsis as described by Bone [2] and DIC as described by Bick [3]. The exclusion criteria were uncontrolled hemorrhage, recent (< 48 hours) cardiac surgery, recent (< 48 hours) resuscitation, platelets < 20,000/mm3, positive human immunodeficiency virus serology, morbid obesity and pregnancy. After obtaining the laboratory data and scores, plasmapheresis was performed using fresh frozen plasma at a 1:1 ratio of the calculated plasma volume for each patient. The second plasmapheresis was performed after 48 hours. The severity of illness and mortality rates were calculated using the APACHE II scale, and the severity of DIC was calculated by the DIC score. The outcomes at 28 days were recorded. Statistical analysis was performed with SPSS version 11.0 for Windows (Chicago, IL, USA); values are expressed as mean ± SD, and P < 0.05 was considered statistically significant. Thirteen patients were treated in the ICU with plasmapheresis from 2002 to 2003. The difference between the mortality rate predicted from the initial APACHE II scores (59.6 ± 21.3%) and the observed 28-day mortality of all patients (20%) was significant (P < 0.05). The decreases in prothrombin time and activated partial thromboplastin time from the initial results were significant at 24, 48 and 72 hours (P < 0.05).
The improvement in the DIC score (from 6.2 ± 0.6 to 3.7 ± 0.9) between baseline and the end of the procedure was statistically significant (P < 0.01). The D-dimer level decreased significantly from 1333 ± 972 before the first plasmapheresis to 638.6 ± 252.9 at the end of the procedure (P < 0.007). The increase in biological markers of sepsis (antithrombin, protein C and protein S activity) was statistically significant (P < 0.01). Protein C activity was also found to be correlated with 28-day survival (P < 0.02). We suggest that repeated plasmapheresis in the early stages of sepsis may be considered a therapeutic method, even during progressive sepsis complicated by DIC. Introduction High cutoff hemofilters are characterized by an increased effective pore size designed to facilitate the elimination of inflammatory mediators in sepsis. This study compares diffusive versus convective high cutoff renal replacement therapy (RRT) in terms of cytokine clearance rates and effects on plasma proteins. Methods Twenty-four patients with sepsis-induced acute renal failure were studied. A Polyflux hemofilter with a cutoff point of approximately 60 kDa was used for RRT. Patients were randomly allocated to either continuous venovenous hemofiltration (CVVH) with an ultrafiltration rate of 1 l/hour (group 1) or 2.5 l/hour (group 2), or to continuous venovenous hemodialysis (CVVHD) with a dialysate flow rate of 1 l/hour (group 3) or 2.5 l/hour (group 4). IL-1ra, IL-1β, IL-6, tumor necrosis factor-α (TNF-α) and plasma proteins were measured daily. Results CVVH achieved a significantly higher IL-1ra clearance compared with CVVHD (P = 0.0003). No difference was found for IL-6 (P = 0.935). Increasing the ultrafiltration volume or dialysate flow led to a highly significant increase in the IL-1ra and IL-6 clearance rates (P < 0.00001). Peak clearance was 46 ml/min for IL-1ra and 51 ml/min for IL-6. TNF-α clearance was poor with both RRT modalities. A significant decline in plasma IL-1ra and IL-6 was observed in patients with high baseline levels. Protein and albumin losses were highest during the 2.5 l/hour hemofiltration mode. Conclusion High cutoff RRT is a novel strategy to clear cytokines more effectively. Convection has an advantage over diffusion in the clearance capacity for IL-1ra, but is associated with higher plasma protein losses. Methods One hundred and ten single MARS treatments, each lasting 6-24 hours, were performed on 39 MODS patients of various pathogeneses (27 male/12 female) (see Table 1). The MARS therapy was associated with a significant removal of NO and of certain cytokines such as TNF-α, IL-2, IL-6, IL-8 and lipopolysaccharide-binding protein (LBP) (see Table 2), together with a marked reduction of other non-water-soluble albumin-bound toxins and of water-soluble toxins. These changes were associated with an improvement in the patients' clinical condition, including hepatic encephalopathy, deranged hemodynamics, and renal and respiratory function, resulting in a marked decrease in the Sequential Organ Failure Assessment (SOFA) score and an improved outcome: 16 patients could be discharged from the hospital or bridged to successful liver transplantation. The overall survival of the 39 patients was 41%. We can confirm the positive therapeutic impact and the safety of MARS in MODS patients of various pathogeneses associated with elevated levels of NO and cytokines. in SLEDD. Figure 1 shows the mortality in the different SHARF categories.
The ICU and hospital lengths of stay were longer in patients with RRT. Observed mortality was lower than expected in non-RRT patients but paralleled the expected mortality in RRT patients. The interim results of an ongoing study show that the SHARF score, with parameters at 0 and 48 hours, has good predictive value in estimating prognosis in ARF patients. Overall mortality was 56.5%, and 59% needed RRT. Mortality was the same regardless of the RRT technique used. Major problems lay in recruiting centers using both techniques equally. The randomisation rate is lower than expected. Methods This is a retrospective study of 144 consecutive patients with a diagnosis of rhabdomyolysis seen in our Emergency Department over 44 months. A 'normal serum creatinine' was defined as less than 1.5 mg/dl (133 µmol/l). Laboratory data included the initial and peak creatine kinase (CK), the anion gap, calcium, phosphate, potassium, blood urea nitrogen (BUN), white blood cell count (WBC), urine toxicology screen, and hematocrit. A second hematocrit collected following hydration was compared with the first to approximate the percentage of initial volume depletion. A second creatinine was also taken following hydration (median 10 hours after admission). Patients with obvious renal insufficiency, defined as a creatinine > 4 mg/dl (354 µmol/l), and patients with a peak CK < 1000 were excluded. Multivariate logistic regression analysis was performed using Small Stata 7.0. For continuous variables, data were dichotomized based on the upper limit of normal values (phosphate, potassium, BUN) or the lower limit (calcium), or on the median values of the anion gap, hematocrit, percentage decrease in hematocrit, WBC, and CK. Significant independent predictors of a normal first creatinine are presented in Table 1. The most powerful independent predictor of a normal post-hydration creatinine was a normal admission creatinine (odds ratio [OR] = 31, confidence interval [CI] = 6.6-145, P = 0.0001). An elevated BUN and a CK greater than 10,000 were negative predictors of a normal second creatinine (OR = 0.20, CI = 0.06-0.65, P = 0.008, and OR = 0.21, CI = 0.07-0.61, P = 0.004, respectively). Conclusion A normal admission creatinine was associated with a WBC < 11.0, the absence of cocaine or amphetamine, less than 18% volume depletion, a normal BUN and a normal calcium. A normal first creatinine was a very strong predictor of a normal second creatinine after hydration. This suggests that otherwise healthy patients with uncomplicated rhabdomyolysis and a normal first creatinine who can hydrate themselves orally may be safely discharged from acute emergency care after correction of volume deficits. Results Plasma M peaked on average 0.6 ± 0.4 days before CK. In 76% of the patients, M and CK increased in parallel and both decreased exponentially (T½(M) = 22 ± 10 hours and T½(CK) = 26 ± 7 hours) (Fig. 1). In 17% of the patients an isolated increase in M was observed; in the majority of these patients abdominal complications were found. In 7% of the patients an isolated increase in CK was observed, typically in relation to mobilisation of the patients. ARF was treated with continuous venovenous haemofiltration. In the majority of patients with rhabdomyolysis, M and CK increase and decrease in parallel. Medium-sized molecules such as M are transported directly from the muscle to the blood, while large molecules such as CK are transported via the lymphatic system. An isolated increase in plasma CK might therefore be observed following mobilisation.
An isolated increase in plasma M ought to raise suspicion of abdominal complications. In patients with rhabdomyolysis, measurement of the nephrotoxic molecule M can be used to predict the risk of ARF. We routinely screen patients with severe muscular exertion for the presence of rhabdomyolysis. We evaluated the significance of a low initial creatine kinase (CK), and the factors associated with a rising CK, in this group. Methods Consecutive patients with rhabdomyolysis were identified over 44 months. A subgroup analysis of those with acute exertional rhabdomyolysis and an initial CK < 1000 U/l identified two groups: patients whose CK values remained stable, and those whose CK values rose to greater than 2000 U/l. No one was excluded. Student's t test and the Fisher exact test were used to analyze continuous and categorical variables, respectively. Results See Table 1. Of 36 patients with CK < 1000 U/l, 15 (33%) had a rising CK. This was significantly associated with elevation of the anion gap and an increased frequency of moderate-to-large urine heme tests. The stable CK group had a significantly higher blood urea nitrogen (BUN). In all cases the anion gap resolved with sedation and hydration, which is clinically consistent with acute lactic acidosis. Five of the 15 patients in the rising CK The aim of this study was to assess the efficacy of the renal replacement therapy adopted in our ICU for patients who developed acute renal failure after admission to the ICU, and to evaluate the role of age in relation to renal function and the probability of survival. During a period of 18 months (February 2002-July 2003), we treated 22 patients with acute renal failure by means of continuous renal replacement therapy. The patients (17 males and 13 females) were divided into two groups: one group (group A) younger than 65 years, and a second group (group B) older than 65 years. Group A was made up of 14 patients, 10 suffering from polytrauma and four from acute pancreatitis, with a mean age of 42 ± 9 years. Group B was made up of 16 patients who had undergone major surgical interventions (four after open heart surgery, four after gynaecological surgery and eight after abdominal surgery), with a mean age of 76 ± 11 years. All the patients were admitted to our ICU with acute respiratory failure and needed mechanical ventilation. Oliguria was diagnosed when the urine output was less than 400 ml/day. The acute renal failure was due to hypotension and sepsis; renal replacement therapy was started when oliguria and/or volume overload was observed. All patients were treated with slow low-efficiency daily dialysis (SLEDD), single pass, adapted to each patient as a low-flow therapy for 10-12 hours, in order to obtain good haemodynamic stability and to ensure a urea clearance of 45-60 l/day and a weekly Kt/V > 6. Statistical analysis was performed with ANOVA and logistic regression. The patients' ICU length of stay was 16.3 ± 5.9 days for group A and 24.4 ± 10.8 days for group B. The SLEDD therapy lasted 10.3 ± 3.9 days for group A and 19.7 ± 7.1 days for group B. The Qb was 150-200 ml/min, the Qd 60-100 ml/min and the mean ultrafiltration 150 ml/hour. The caloric intake was 32.7 ± 7 kcal/kg/day, with a protein intake of 1.9 ± 0.5 g/kg/day. Six patients of group A (42.8%) and 11 patients of group B (68.7%) died. All the survivors recovered renal function. Sepsis is the most relevant cause of acute renal insufficiency and mortality in critically ill patients.
In our experience, SLEDD can be considered a safe and efficacious treatment for these patients: it allows aggressive volume removal and adequate nutritional support, coupled with haemodynamic stability and uremic control. Logistic regression showed that age, severity of illness and the number of organs affected were independent risk factors for a poor outcome. Acute renal failure (ARF) is common in critical care patients. One risk factor for ARF is age. However, access to dialysis is often limited in older patients because of their worse prognosis. We report our experience of dialyzing old patients, and compare them with younger patients with ARF in our unit. Since September 1994, we have recorded all dialyzed patients, treated with intermittent dialysis (IHD), continuous dialysis (CRRT) or both, depending on clinical decisions. For IHD we use a Fresenius 4008 E machine and polysulfone dialyzers. For CRRT, we first used a Fresenius DM 08 machine and, from 2001, a Braun Diapact CRRT machine, with polysulfone hemofilters. Since 1994, we have had 163 patients, 68.4% male, with a mean age of 60 (range 15-92) years. The mean APACHE II score at admission was 21.6. Eighty-three percent needed mechanical ventilation, 53% were primarily surgical, and in 74% there was a relevant infection or sepsis. Of the total, 52 patients (32%) were older than 75 years (mean age 83 ± 4 years). In this group, 79% were male, 83% needed mechanical ventilation, 54% received nephrotoxic agents (including radiocontrast), 58% were primarily surgical and 72% had sepsis. Compared with younger patients, older patients were more often male, and had a higher APACHE score (23.2 versus 20.6, P < 0.05) and mortality (55.8% versus 40.5%, P < 0.05). In the older group, the main differences between survivors and nonsurvivors were the incidence of sepsis (79% in nonsurvivors versus 60% in survivors) and the type of patient (69% surgical in nonsurvivors versus 43% in survivors). Age, use of mechanical ventilation and APACHE score were not different between survivors and nonsurvivors. Regarding the modality of dialysis, CRRT was used in 13 patients, with 11 deaths (84%). In contrast, the mortality of younger patients treated with CRRT was 47%. As for ARF outcome, 100% of nonsurvivors still had ARF at death; in contrast, none of the survivors required dialysis at hospital discharge. Over the years, dialysis was considered more often: 52% of the older dialyzed patients were admitted after 2001, and CRRT was also used more often (only two CRRT treatments before 2001 in the older group). In conclusion, ARF is common in old patients. Even though ARF and dialysis are associated with high mortality, still 44% of old patients submitted to acute dialysis survive and are discharged free of dialysis. With time, more old patients will be admitted to ICUs and treated with dialysis. As other authors have concluded, age should not be used as an isolated factor when deciding to treat ARF with dialysis. The aim of this study was to compare the outcome of critically ill patients requiring RRT with and without HM, as well as to assess whether the presence of HM is independently related to a higher mortality in this population. Conclusion The incidence of ARF in the PICU is substantial and is associated with significant mortality. Risk factors are multiple and could be preventable. This is the first prospective epidemiologic study focusing on pediatric ARF in the PICU. A multicenter study is planned to confirm these results.
The RIFLE classification of acute renal failure (ARF) has yet to be evaluated in a general ICU population. RIFLE defines three grades of severity of ARF on the basis of either urine output (U) or an acute increase in serum creatinine (C): risk = C × 1.5 or U < 0.5 ml/kg/hour × 6 hours; injury = C × 2 or U < 0.5 ml/kg/hour × 12 hours; and failure = C × 3, C ≥ 4 mg/dl, U < 0.3 ml/kg/hour × 24 hours, or anuria × 12 hours. We prospectively collected data for all patients admitted to an ICU from 1 July 2000 to 30 June 2001 at the University of Pittsburgh Medical Center, a tertiary hospital with > 120 ICU beds serving medical, surgical, trauma, neurologic, and transplant patients. We tested two hypotheses: (1) increasing RIFLE class corresponds to decreasing ICU occurrence and increasing hospital mortality; and (2) patients classified by C or U criteria have similar occurrence and mortality. We classified patients according to their worst class on the C or U criteria. For the baseline C, we selected the lower of the admission C and the predicted C (based on age, sex, and race, using the Modification of Diet in Renal Disease formula). Results A total of 5754 admissions and 5313 patients were evaluated. Occurrence and mortality are presented in Table 1. Conclusions The RIFLE criteria appear clinically sensible, with mortality increasing across classes. Unfortunately, the C and U criteria alone yield different results, and occurrence decreases across classes only when the C criteria are used alone. Methods This is a post-hoc analysis of a randomized, prospective trial of patients presenting to an urban Emergency Department (ED) with severe sepsis and septic shock. Patients were enrolled in the study if they had two or more systemic inflammatory response syndrome criteria plus hypotension or lactic acidosis (> 4 mmol/l). Patients were randomized to either conventional care or EGDT, which included resuscitation to goals of a central venous pressure (CVP) between 8 and 10 mmHg, a mean arterial pressure (MAP) between 65 and 90 mmHg, and a central venous oxygen saturation (ScvO2) greater than 70%. Ten patients with ESRD were in the control group and eight in the treatment group. Baseline hemodynamic values (heart rate [HR], MAP, CVP, ScvO2) and severity-of-illness scores (APACHE, MODS, SAPS) were not different between groups. Seven out of 10 patients in the control group died, compared with 1/8 in the EGDT group (P < 0.05). At 6 hours, the HR and CVP remained equal between groups, but a statistically significant difference in ScvO2 and MAP was found between groups (P < 0.05). Additionally, the treatment group displayed a significant fall in lactate and in APACHE, MODS and SAPS scores (P < 0.05). The differences in ScvO2, lactate and severity of illness remained significant at 72 hours (P < 0.05). Conclusion EGDT is both safe and efficacious for patients with ESRD, resulting in a reduction of both morbidity and mortality. This improvement in outcome can be explained by the eradication of global tissue hypoxia by aggressive resuscitation guided by goal-directed care. Although EGDT was performed for only 6 hours in the ED, the physiologic benefits persisted at 72 hours. Objective Although simple closure with an omental patch (SC) has been the standard procedure for perforated duodenal ulcer (PDU) in most institutes in Japan, this procedure was originally performed for poor-risk patients. Basic evaluation of the SC has been insufficient, and the SC has been performed on the basis of experience alone. Gastrointestinal endoscopic examination (GF) makes it possible to evaluate such fine structures.
However, it is not known whether early endoscopic examination after the SC is safe. The aim of this study was to clarify the macroscopic findings of the healing process after the SC for PDU. We perform the SC in patients with PDU without stenosis and without a prominent ulcer ridge. Medication with an H2-RA (or PPI) is started just after the operation, and oral feeding 4-5 days after the operation, independent of postoperative GF. Eleven patients with PDU who were treated with the SC underwent postoperative GF on the 4th-16th postoperative day, and the healing process was examined. We do not perform radiographic examination for leakage from the anastomosis. All 11 patients were informed that the healing process could be confirmed in this way, albeit with some unknown risk. In three of the 11 patients, GF findings showed an active stage or a healing stage. In two of them the surgical technique was thought to be insufficient (the distance between the stitch and the edge of the ulcer was insufficient), and in the other patient postoperative GF was performed on the 4th postoperative day because of transfer to another, smaller hospital. The other eight patients showed a scar phase with good granulation and without exposure of stitches. Iatrogenic perforation, bleeding, or severe abdominal pain during or after postoperative GF was not seen in any case. Conclusion Postoperative GF in the early phase after the SC is useful and safe for evaluating the healing process of the SC for PDU. Oral intake can be started 1 week after this surgical procedure. The kinetics of the pancreatic hormone glucagon in patients with acute pancreatitis have not been investigated as carefully as those of insulin, in spite of its crucial influence on energy metabolism. In the present study, we examined the kinetics of glucagon and glucagon-related peptides assessed by radioimmunoassay. Furthermore, the molecular forms of these peptides were examined using gel filtration chromatography, and the glucagon processing in the pancreas and intestine in the early stage of acute pancreatitis was investigated. Methodology Fourteen patients with acute pancreatitis were enrolled in this study. Eight had severe pancreatitis (group S) and six had mild pancreatitis (group M). Ten healthy volunteers were also enrolled as normal controls (group C). Serum levels of glucagon and glucagon-related peptides were assessed on the second admission day in groups S and M, and in an early-morning fasting state in group C, using nonspecific N-terminal (glucagon-like immunoreactivity [GLI]) and specific C-terminal (immunoreactive glucagon [IRG]) glucagon radioimmunoassays. The molecular forms of these peptides were also estimated using gel filtration chromatography. We then discuss glucagon processing based on these findings. Serum GLI and IRG in groups S and M were significantly higher than in group C (P < 0.01), while those in group S were also significantly higher than those in group M (P < 0.05). In all patients in groups S and M, except for three in group S, a peculiar glicentin-like peptide (GLLP) (molecular weight about 8000), distinct from pancreatic glucagon, was observed on IRG gel filtration chromatography; this peptide was clearly absent in group C. The kinetics and processing of glucagon in patients with acute pancreatitis were quite different from those of healthy subjects.
In patients with acute pancreatitis, a peculiar processing of glucagon proceeded in the intestine, quite different from ordinary glucagon processing in either the pancreas or the intestine, generating the peculiar GLLP. Aim of the study To confirm cytokine involvement in the development of pulmonary failure in patients with acute pancreatitis, and to investigate the effect of somatostatin on some of the cytokines associated with lung injury. In a 1-year prospective clinical study we examined 22 patients with severe acute pancreatitis. Eleven of them (group I) were treated early with somatostatin (0.1 mg every 8 hours for 7 days). The second group (group II) was treated with standard therapy. Pulmonary failure was assessed on the basis of radiographic findings, computerised tomography scans, bacteriological diagnosis of endotracheal aspirates and acid-base abnormalities. Serum concentrations of IL-1, IL-6 and TNF-α were determined during the time course of the study, as were white blood cell counts, amylase, transaminases, urea and creatinine. All 22 patients developed MODS, and all laboratory parameters indicative of SIRS were increased. Serum concentrations of IL-1, IL-6 and TNF-α were elevated in all patients, but with a highly significant difference (P < 0.03) in the 14 patients who developed pulmonary failure (three from group I and all 11 patients from group II). The peak serum concentrations of cytokines were found in patients from group II, with IL-6 showing the greatest increase. Lung injury treated with mechanical ventilation (14 patients), pulmonary drainage (six patients) and even decortication (three patients) correlated with increased serum values of IL-6. Changes in chest radiographs depended on the increased values of all cytokines; the more severe the lung injury, the higher the values. Comparison between the patients from group I and those from group II showed significant differences in the values of cytokines, the values being higher in group II. Serum concentrations of cytokines were significantly higher (P < 0.05) in the five patients who died (all of them with pulmonary failure) than in those who survived. High cytokine values are associated with the development of pulmonary failure and increased mortality in patients with acute pancreatitis. Initial aggressive treatment with somatostatin decreases cytokine release and may promote a good outcome in these patients by reducing the occurrence and severity of pulmonary failure. Methods Twenty-six normal volunteers (mean age 24 years; 14 female and 12 male), breathing spontaneously, underwent abdominal circumference measurement. Thereafter, we obtained their RR, TV, FVC, MIP and MEP. We then repeated the measurements after compression of the abdomen with an external band reducing the basal abdominal circumference by 10% and by 15%. The band was then removed and the respiratory parameters were measured again. Results See Table 1. Abdominal compression by 10% and 15% with an external band decreased TV, MIP and MEP (the last only with 15% compression) and did not affect RR and FVC. Leukocytes were counted and differentiated. The parameters assessed were the chemokine IL-8; myeloperoxidase (MPO), a measure of neutrophil activation; and albumin and alpha-2-macroglobulin, to measure lung permeability. All assessed parameters were significantly higher in abdominal fluid than in the other two compartments (up to 100-fold for cell counts and 1000-fold for IL-8 concentrations compared with blood; P < 0.0001).
Leukocyte counts, neutrophil counts/percentages and IL-8 concentrations were also significantly higher in the lung than in blood or plasma (P < 0.05). The absolute number, percentage and activity (MPO concentration) of neutrophils in the lung were higher at all time points in peritonitis patients (even at day 0) than in short-term ventilated controls (P < 0.05). Alveolar membrane permeability, as measured by the relative coefficient of excretion (the ratio of the albumin excretion ratio to the alpha-2-macroglobulin excretion ratio), was increased in peritonitis patients at all time points. Discussion Neutrophil accumulation and activation in the abdominal cavity and the pulmonary compartment in patients with peritonitis are likely to be compartmentalized. In the various compartments of the body, the host defense systems vary in extent and nature. An impressive pulmonary response develops early in secondary peritonitis: neutrophils are activated and lung permeability is increased above normal values. The values for IAP (mmHg) were 9.7 ± 3.2 (IGP) vs 9.8 ± 3.1 (IBP), and 72.1 ± 16.5 for APP. There was a good correlation: IBP = 0.85 × IGP + 1.6 (R2 = 0.79, P < 0.0001). Bland and Altman analysis showed good agreement: IGP was almost identical to IBP, with a mean bias of -0. Conclusion Estimation of IAP via IGP or IBP is feasible. The novel IGP method is less time consuming, fully automated (autocalibration), and allows a continuous trend. The FoleyManometer offers a cost-effective alternative. Both are accurate and reproducible. The COVA for the obtained parameters in sedated patients is around 15-20% over a 24-hour period and thus varies substantially. These variations may be even more pronounced in nonsedated patients. IAP and APP are therefore continuous variables like any other pressure, and should be monitored as often as possible during the day so that treatment can be adapted accordingly. Background and goal of study Air tonometry is a common method of examining splanchnic hypoxia, which is often a predictor of severe clinical complications such as sepsis or renal failure. Urodilatin is one of the indicators of renal function. A retrospective case-control study was conducted to consider whether there is an inter-relation between splanchnic ischaemia and renal function. After approval by the local ethics committee, 39 intensive care patients were studied. The age of the patients ranged from 37 to 98 years (median age 64 years). Hemodynamic variables, including continuous cardiac output, air-tonometric variables via a nasogastric tube (Tonocap, Datex, Helsinki, Finland), arterial lactate, urine production, heart rate, central venous pressure (CVP) and mean arterial pressure were measured every hour. Furthermore, renal values such as urodilatin, ANF and urine excretion were assessed after 6, 7 and 8 hours. The increase in CO2 gap values within the first 6 hours was determined by curve estimation and correlated (bivariate correlation) with indicators of renal function using Spearman's rho and the Pearson correlation. The CVP increase within the first 6 hours was assessed similarly to the CO2 gap increase. Results and discussion A significant association (P < 0.05) was found between air tonometry values and renal function. A bivariate correlation of -0.356 (P = 0.039) was found between the CO2 gap increase and urine urodilatin assessed after 7 hours.
Spearman's rho was -0.358 (P = 0.044) between the CO2 gap increase and urodilatin excretion, and 0.407 (P = 0.014) between the CO2 gap increase and urine excretion after 7 hours. The strongest correlation was found between the CO2 gap increase and the urodilatin-creatinine ratio (-0.503, P = 0.012). There was no correlation between the CO2 gap increase and serum creatinine. There was also no appreciable inter-relation between the CVP increase and the CO2 gap increase (Pearson = 0.000, P = 0.999; Spearman's rho = 0.047, P = 0.778). Furthermore, the duration of ventilation correlated positively with the CO2 gap increase (0.326, P = 0.043). This study suggests that the CO2 gap increase is a predictor of decreasing urodilatin values, indicating renal failure. Air tonometry is therefore helpful in the early detection of deteriorating renal function as a sign of multiorgan failure. Introduction EAP may compromise chest wall mechanics and worsen lung mechanics and gas exchange. Temporary relief may further lead to reperfusion injury. We investigated the effects of EAP and reperfusion injury on respiratory system mechanics and gas exchange under pressure-controlled ventilation (PCV) without and with spontaneous breathing. Methods In a porcine model, EAP was set to 30 cmH2O by CO2 insufflation twice for 9 hours each, with a pressure relief of 3 hours in between and at the end. Anesthetised pigs (46.3 ± 3.3 kg) received PCV (n = 10) or biphasic positive airway pressure (BIPAP) (n = 10) with a positive end-expiratory pressure of 5 cmH2O, allowing spontaneous breathing of up to 20% of total ventilation, and were randomly assigned to the control group (n = 4) without EAP or the EAP group (n = 6) in each mode. Measurements of lung mechanics and gas exchange were performed every 2 hours. In the control group, no changes were observed over time or between modes. For the results of the EAP group see Table 1. During the second EAP phase, lung mechanics deteriorated further, and returned to baseline values after pressure relief only with BIPAP. In both modes, oxygenation after the last pressure relief was worse than at baseline. Conclusion An EAP applied twice for 9 hours each worsened chest wall and lung mechanics and impaired gas exchange regardless of the ventilation mode. These changes were not completely reversed after pressure relief. If spontaneous breathing was present during BIPAP, there was less impairment of lung compliance and oxygen delivery, but this did not lead to differences in gas exchange. Background Organ dysfunction attributable to intraabdominal hypertension, known as abdominal compartment syndrome (ACS), has been recognised as a source of morbidity and mortality in the ICU. Decompressive laparotomy is considered the sole definitive therapy for ACS. However, no randomised trial has been performed to assess the role of surgery. We studied the characteristics and outcome of critically ill patients with ACS treated nonoperatively, and focused on the natural course of organ dysfunction. We retrospectively studied patients with ACS (intraabdominal pressure [IAP] ≥ 18 mmHg and at least one organ failure) admitted to our medical and surgical intensive care unit over two consecutive years (2001-2002). We focused on organ failure at the time of the raised IAP and on its evolution in patients who did not undergo decompressive laparotomy.
Hemodynamic instability was defined as dependence on vasopressor therapy, respiratory failure as a P/F ratio < 300, and renal failure as a creatinine level > 2 mg/dl or a 50% increase from baseline. We found 24 episodes of ACS in 23 patients that were managed nonoperatively. Mortality was 26% (6/23). Patients who died had a higher APACHE II score on admission (28.5 ± 8.3 vs 20.8 ± 7.4 in the survivor group; P = 0.04), but the incidence of organ failure and the highest recorded IAP were not significantly different. At discharge, organ function had returned to baseline levels in all survivors (median length of stay 13 days). Conclusion In a group of critically ill patients with ACS treated nonoperatively, the mortality was 26%. Patients who survived were discharged from the ICU without organ dysfunction. We conclude that selected patients with ACS may not require decompressive laparotomy, but will respond to nonoperative observation and supportive therapy. Introduction Splanchnic ischemia due to sepsis or hemorrhage is believed to be involved in the pathogenesis of the multiple organ dysfunction syndrome. In previous studies performed in our laboratory we examined microcirculatory blood flow of the small intestinal mucosa in a porcine model of septic shock [1], and demonstrated that microcirculatory blood flow (MBF) was redistributed from the jejunal muscularis towards the mucosa. The aim of this study was to measure changes in MBF in the jejunum during stepwise occlusion of the superior mesenteric artery (SMA). Methods Data for the sepsis group were extracted from a previously published study [1]. Swiss landrace pigs (30 kg) were anesthetized and mechanically ventilated. Cardiac output (CO) was measured using thermodilution, and SMA flow with ultrasonic transit-time flowmetry. MBF was measured in the jejunal mucosa and muscularis using laser Doppler flowmetry. In the septic shock group (group S, n = 11), sepsis was induced by fecal peritonitis. In the SMA occlusion group (group O, n = 8), SMA flow was reduced in steps of 15% using an occluder. Results are presented as percent of baseline (mean ± SD; *P < 0.05 compared with baseline, †P < 0.05 mucosa vs muscularis). During septic shock, the CO, SMA flow and MBF of the jejunal muscularis decreased to 56 ± 13%*, 47 ± 13%* and 36 ± 17%*†, respectively, while the MBF in the jejunal mucosa remained virtually unchanged (86 ± 20%†). In nonseptic animals, stepwise occlusion of the SMA to 42 ± 4%* of baseline produced a decrease in muscularis blood flow to 43 ± 12%*†, while mucosal blood flow decreased less, to 65 ± 24%*†. Conclusion During reduced regional blood flow, microcirculatory flow decreased less in the intestinal mucosa than in the muscularis, suggesting redistribution of flow away from the muscularis towards the mucosa, both in septic and in nonseptic subjects. Blood flow redistribution appeared to be more pronounced in septic than in nonseptic animals. Objective Vasopressors are recommended for circulatory support during acute sepsis. They maintain blood pressure, but have differing effects on cardiac output (CO) and renal blood flow (RBF). In particular, they may impair RBF and increase the risk of renal insufficiency. The aim of the study was to compare the effects on RBF of five vasopressors, noradrenaline (NA), adrenaline (ADR), dopamine (DOP), phenylephrine (PE) and vasopressin (VP), in acute sepsis. In seven anaesthetized dogs, CO and RBF were measured directly using flow probes.
Mean arterial pressure (MAP) was measured by cannulating the femoral artery. Systemic vascular resistance (SVR) was derived. Sepsis was induced by injecting Escherichia coli. Each vasopressor was infused in random order at a recommended rate (Table 1) for 30 min. Percentage changes in MAP, CO, SVR and RBF compared with baseline were calculated. Data were expressed as mean (SD), and statistical analysis was performed using one-group t tests. Results All five vasopressors increased MAP (P < 0.01), but their effects on CO varied, with ADR and DOP producing the greatest increases (P < 0.01). SVR was increased by NA, PE and VP (P < 0.01), and was decreased by ADR (P < 0.01). Whereas NA and DOP increased RBF (P < 0.05), RBF was decreased by ADR and PE (P < 0.05; Table 1). Conclusion Despite similar effects on MAP, vasopressors have different effects on CO, SVR and RBF. NA and DOP were the most effective vasopressors in preserving RBF during sepsis, while ADR and PE had negative effects. P178 Epinephrine induces tissue perfusion deficit in porcine endotoxin shock: evaluation by regional CO2 content gradients and L/P ratios Background We hypothesized that epinephrine (EPI), as opposed to norepinephrine (NE), has selective adverse effects on visceral perfusion when used to treat hypotensive endotoxin shock. We specifically wanted to test the hypothesis that perfusion deficiency could be detected by regional venous-to-arterial CO2 content gradients (as opposed to pCO2 gradients). Conclusions EPI, unlike NE, induced an overall relative reduction of splanchnic blood flow and specifically reduced gastric wall perfusion as evaluated by regional CO2 content gradients. The lactate gradient and L/P ratio over the gastric wall increased accordingly. These findings suggest that EPI should not be used as a first-line vasopressor in septic shock. Methods A small segment of jejunal mucosa was exposed by midline laparotomy and antimesenteric incision in 16 anesthetized, paralyzed, and normoventilated pigs. PO2muc (Clark-type surface oxygen electrodes), HbO2 (tissue reflectance spectrophotometry) and PU (laser Doppler velocimetry) were measured. Systemic haemodynamics, mesenteric-venous acid-base and blood gas variables, and systemic acid-base and blood gas variables were recorded. After a stabilization period and baseline measurements, measurements were performed at 20-min intervals during increasing dosages of AVP (AVP group; n = 8; 0.007, 0.014, 0.029, 0.057, 0.114, and 0.229 U/kg/hour) or saline placebo (CTRL group; n = 8). Results AVP infusion led to a significant decrease in cardiac index (121 ± 31 ml/kg/min vs 77 ± 27 ml/kg/min) and systemic oxygen delivery (13.7 ± 3.0 ml/kg/min vs 9.1 ± 3.4 ml/kg/min), concomitant with an increase in the systemic oxygen extraction ratio (31 ± 4% vs 48 ± 10%). AVP significantly decreased microvascular blood flow (133 ± 47 PU vs 82 ± 35 PU), mesenteric venous oxygen tension (25.6 ± 7.1 mmHg vs 6.9 ± 2.4 mmHg) and microvascular hemoglobin oxygen saturation (51.3 ± 9.0% vs 26.4 ± 12.2%), without a statistically significant increase in the mesenteric venous lactate concentration (2.3 ± 0.8 mmol/l vs 3.4 ± 0.7 mmol/l). Conclusion Intravenously administered AVP decreases intestinal oxygen supply and mucosal tissue oxygen tension in a dose-dependent manner, via a reduction in microvascular blood flow, in healthy pigs.
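As an illustration of the haemodynamic derivations used in the two preceding animal studies (SVR derived from pressures and cardiac output, and each variable expressed as a percentage change from baseline), a minimal sketch in Python follows. It assumes the conventional formula SVR = 80 × (MAP − CVP)/CO in dyn·s/cm5; the numbers are invented for illustration and are not data from these studies.

def svr_dyn_s_cm5(map_mmhg, cvp_mmhg, co_l_min):
    # Conventional derivation of systemic vascular resistance:
    # pressure drop across the systemic circulation divided by flow,
    # scaled by 80 to convert mmHg/(l/min) into dyn.s/cm^5.
    return 80.0 * (map_mmhg - cvp_mmhg) / co_l_min

def percent_change(value, baseline):
    # Percentage change from baseline, as reported for MAP, CO, SVR and RBF.
    return 100.0 * (value - baseline) / baseline

# Hypothetical numbers (not study data): MAP rises from 70 to 95 mmHg while
# CO falls from 4.0 to 3.5 l/min during a 30-min infusion; CVP taken as 5 mmHg.
svr_baseline = svr_dyn_s_cm5(70, 5, 4.0)   # 1300 dyn.s/cm^5
svr_infusion = svr_dyn_s_cm5(95, 5, 3.5)   # about 2057 dyn.s/cm^5
print(f"SVR change vs baseline: {percent_change(svr_infusion, svr_baseline):+.0f}%")

In this invented example the SVR rises by roughly 58% despite a falling CO, which illustrates why a pure vasoconstrictor can raise MAP while reducing regional flows such as RBF.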
Introduction The circulation in septic shock is characterized in the hyperdynamic phase by an increased cardiac output and a decreased peripheral vascular resistance, and consequently low blood pressure. The use of norepinephrine as a pure vasoconstrictor aims to increase vascular resistance and blood pressure. There is a subgroup of patients with septic shock who do not respond to usual or high doses of norepinephrine; in these patients, different therapeutic approaches are usually tried. In this study, we evaluated the role of very high doses of norepinephrine (≥ 2 µg/kg/min) in the treatment of septic shock. In our department the therapeutic approach to septic shock includes aggressive haemodynamic monitoring (pulmonary artery catheterization) and the use of norepinephrine and/or dobutamine in combination, along with fluid resuscitation. In this retrospective study we present 12 patients with septic shock who required very high doses of norepinephrine (≥ 2 µg/kg/min). All patients were catheterized with a Swan-Ganz catheter and received adequate fluid resuscitation (pulmonary wedge pressure ≥ 15 mmHg). Norepinephrine was titrated to maintain a mean arterial pressure > 70 mmHg. In nine of the 12 patients, dobutamine was added to achieve a cardiac index > 3.5 l/min/m2. Results Two patients never recovered from septic shock and died. Another six patients died at a later time from multiple organ failure, despite having improved from the shock and been weaned off norepinephrine during their course. Finally, four patients survived to be discharged from the intensive care unit after a mean intensive care unit stay of 25 days. The APACHE II score did not differ between survivors and nonsurvivors (Table 1; values are shown as mean [range]). The use of very high doses of norepinephrine in the treatment of septic shock, under the guidance of aggressive haemodynamic monitoring, may improve survival in this group of patients with very high mortality. We may need to reconsider labeling patients as having norepinephrine-resistant septic shock when the usual doses are ineffective, and to try high doses before switching to another regimen. Materials and methods Twenty-six intensive care patients (five women and 21 men) were selected after surgery. All of them showed an increasing gastric-arterial PCO2 difference (CO2 gap) over at least 6 hours. An increasing CO2 gap was taken as an indicator of decreasing splanchnic perfusion. Furthermore, each patient was matched with one postsurgery intensive care patient without an increase in the CO2 gap. Tonometric variables via a nasogastric tube (Tonocap, Datex, Helsinki, Finland), hemodynamic variables, temperature and arterial lactate were measured every hour. The differences between the day 1 (after surgery) and day 0 values of PCT (pct-diff), IL-6 (il_diff) and LBP (lbp_diff) were also assessed. By means of the SAPS and APACHE scoring systems, the general condition of each patient was evaluated, and patients were then enrolled into two groups (group 1, SAPS ≤ 5; group 2, SAPS > 5). Statistical analysis was performed using analysis of covariance (the time for which patients were ventilated in the operating room was defined as the covariate). The mean pct-diff was significantly higher (adjusted mean = 2.22, P = 0.01) in the group with an increasing CO2 gap than in the decreasing group (adjusted mean = 0.59). There was no significant difference in LBP values between the two groups, but there was a significant difference according to the general condition of the patients (adjusted mean SAPS group 1 = 20.22; adjusted
mean SAPS group 2 = 14.03; P = 0.13). In IL-6 values there were also no significant differences, but the group means suggest that il_diff values were even higher in the group with a decreasing or stable CO2 gap 1 day after surgery (adjusted mean stable group = 41.34, adjusted mean increasing group = -134.6; P = 0.056). There were no other significant differences in lactate, cardiac index, blood pressure or mean arterial pressure between the two groups. Furthermore, there was a trend toward a difference in ICU time: patients with an increasing CO2 gap stayed longer in the ICU than the other patients (mean 4642 min vs 2934 min; P = 0.094). Conclusion The differences in the PCT values suggest that an increase in the CO2 gap measured by air tonometry is accompanied by a later increase in PCT values. Tonometry is thus the earlier method for detecting aspects of inflammation. These findings fit with the longer ICU stay of patients with an increasing CO2 gap.

Purpose To evaluate the morphology and vascularity of the gallbladder in ICU patients examined by ultrasonography (US), color Doppler (CD) and contrast-enhanced (CE) study, and to assess the value of the method in the diagnosis of acute acalculous cholecystitis (AAC). Methods We prospectively examined 50 consecutive patients who were admitted to the ICU with a variety of diagnoses. The patients were examined 7 days after admission to the ICU. Follow-up examinations were performed every 7-10 days. A total of 98 examinations were obtained in 50 patients. In each examination the gallbladder was examined by US, CD and CE study with galactose-based microbubbles (SHU 508 A, Levovist). Sonographic parameters (distention, wall thickening, contents, pericholecystic fluid, pericholecystic edema) were obtained and vascularity (normal or abnormal flow signals) was estimated. The findings were correlated with clinical and laboratory parameters, and with histology if cholecystectomy was performed. Results Four out of 50 patients (8%) had one sonographic abnormality while they were in the ICU, 7/50 (14%) had two abnormalities, 11/50 (22%) had three abnormalities, 22/50 (44%) had four or more abnormalities, and only 6/50 (12%) had no sonographic abnormality. In 2/50 (4%) hypervascularisation was detected in the CD and CE study. Only these two patients had surgically proven AAC. Conclusion Gallbladder abnormalities are frequently seen on US in ICU patients, even in patients not suffering from AAC. The sonographic criteria are not specific. CD imaging, especially after CE imaging, is useful for the detection of hyperemia in the acute stage of gallbladder inflammation and may improve the accuracy of the method in the early diagnosis of AAC.

Objective To investigate factors that may predict outcome in patients with severe abdominal sepsis requiring treatment in an intensive care unit (ICU). Design A retrospective record review of survivors and nonsurvivors, comparing clinical, laboratory, microbiological, and therapeutic data, to identify specific poor prognostic factors. Setting A tertiary referral centre. Results Of 54 patients studied there were 16 survivors (29.6%) and 38 nonsurvivors (70.4%).
The nonsurvivors had a significantly longer stay in the ICU (22.3 days vs 7.9 days; P = 0.0042), significantly more laparotomies per patient (4.1 vs 2.4; P = 0.024), more patients with an open abdomen following surgery (15 vs 1; P = 0.01), significantly more blood transfusions (10.8 vs 2.9; P = 0.0077), and a significantly higher mean APACHE II score on admission (16.5 vs 10.7; P = 0.0175). The initial surgery was performed as an emergency procedure in 72% of patients; of these, four were delayed for more than 48 hours. Of the survivors, the source was eradicated at the initial laparotomy in 12 of 14 patients. Most of the patients who died developed multiorgan failure. In this regard, none of the patients who survived required dialysis, whereas 44% of the nonsurvivors were dialysed. Nonsurvivors had more organisms isolated from the peritoneal fluid and more bacteremic episodes. The most frequently isolated organism from the peritoneal fluid in nonsurvivors was Escherichia coli. Conclusions Patients with abdominal sepsis are at risk of dying if there is a delay in surgery and if the source of sepsis cannot be controlled at the first operation. In addition, more than four relook laparotomies, renal failure requiring dialysis, longer ICU stay, culture of pathogens from the peritoneal cavity and blood, higher APACHE II score, and requirement for blood transfusion contribute significantly to mortality.

Background Originally known as a target molecule for Plasmodium vivax and as a blood group antigen, the Duffy antigen receptor for chemokines (DARC) has emerged as a promiscuous binding site for chemokines. Besides red blood cells, DARC is also expressed on endothelial cells, even in Duffy-negative individuals. However, the function of DARC during systemic inflammation remains unknown [1]. In a neutrophil (PMN)-dependent model of LPS-induced acute renal failure (ARF) [2], we sought to assess whether DARC plays a role in the development of organ failure during systemic inflammation. Methods ARF was induced by intraperitoneal injection of LPS in wild-type mice (WT) and DARC gene-deficient mice (DARC-/-). At 4, 12, and 24 hours after injection, blood samples were drawn and kidneys were removed. Plasma creatinine (Crea) and blood urea concentrations served as indicators of renal function, and renal myeloperoxidase activity (MPO) as an indicator of total renal PMN content. Untreated WT and DARC-/- mice constituted the corresponding control groups. Data analysis included ANOVA with subsequent multiple comparison analyses when appropriate. Results Over 24 hours, WT developed severe intrarenal ARF with a more than threefold increase in Crea. DARC-/- mice, on the other hand, displayed significant protection from ARF, exhibiting only a small, clinically irrelevant increase in Crea over 24 hours (Fig. 1). MPO was not different between WT and DARC-/-; both groups showed significantly elevated renal MPO after LPS injection, peaking at 4 hours (Fig. 2) and declining thereafter. Even after LPS injection, RT-PCR failed to detect DARC mRNA expression in the kidneys of WT mice, whereas it demonstrated cerebral expression of DARC mRNA at 12 and 24 hours after LPS administration.

The aim of this study was to investigate the early cellular events in ischemia-reperfusion (I-R) injury, using elective aortic surgery as a model. Twenty patients undergoing elective abdominal aortic aneurysm surgery were prospectively recruited. Ten underwent conventional open repair and 10 had endovascular aneurysm repair (EVAR). Mucosal biopsies of the sigmoid colon were taken immediately preoperatively and postoperatively.
Microscopic examination was performed using H&E and TUNEL staining for apoptosis, and in situ hybridisation techniques were used to detect IL-6 mRNA expression. Intraoperatively, systemic and splanchnic blood was sampled in the conventional surgery group; in the EVAR group only systemic blood was taken. Plasma was assayed for nitrosothiols (nitric oxide donors) and IL-6. Results There were no histological features of acute inflammation in the preoperative or postoperative biopsies in either group. H&E and TUNEL staining showed a 3.5-fold rise in apoptotic bodies following conventional surgery; there was no significant change following EVAR. IL-6 expression occurred in the colonic epithelium, localized to the base of the crypts. Splanchnic blood showed a fourfold increase in nitrosothiols and a 22-fold increase in IL-6 within 15 min of reperfusion. Peripheral blood showed a fivefold increase in IL-6 (Student's t test, P < 0.05). Conclusion There is a significant increase in IL-6 release both from the colon and in the systemic circulation following reperfusion, which is accompanied by an increase in nitrosothiols. The colonic epithelium is a source of IL-6. Apoptosis, rather than necrosis, has been shown to be the principal mode of cell death in early I-R injury.

Introduction Risk stratification of severely ill patients remains problematic, resulting in increased interest in potential circulating markers, such as cytokines, procalcitonin and brain natriuretic peptide. Recent reports have indicated the usefulness of plasma DNA as a prognostic marker in various disease states such as trauma, myocardial infarction and stroke. To our knowledge, plasma DNA has not been investigated in critically ill patients in the intensive care setting. Methods Fifty-two consecutive patients were studied in a general intensive care unit. Blood samples were taken on admission and stored for further analysis. Plasma DNA levels were estimated by a PCR method using primers for the human β-globin gene. Patients were followed up to 3 months. In addition, plasma DNA concentrations were found to be significantly different between patients who developed a sepsis state and those who did not (septic patients, median = 192.1 ng/ml, IQR = 298; nonseptic patients, median = 73.8 ng/ml, IQR = 110.6; P = 0.03). Receiver operating characteristic (ROC) curves were calculated for the use of plasma DNA as a predictor of death and of sepsis (Table 1). The results presented here demonstrate that plasma DNA may be a useful prognostic marker of mortality and sepsis in critically ill patients. Further research is clearly needed into the use of this novel marker in the intensive care setting and into the possible mechanisms of release/clearance of plasma DNA in disease states.

Setting The ICU of a surgical department. Methods Twenty-six multiple trauma patients were enrolled in the study, nine of whom developed infections during their hospital stay. Inflammation parameters were measured daily until discharge from the ICU. PMN migration was determined in fresh whole blood with a novel ready-for-use membrane filter assay; the percentage of PMNs migrating from the blood into the filter upon F-Met-Leu-Phe stimulation was the relevant readout. The other parameters were measured with conventional, commercially available methods.
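Both the plasma DNA abstract above and the statistics described immediately below turn a continuous marker into a decision rule via receiver operating characteristic (ROC) analysis, and a later abstract selects PCT cutoffs with Youden's index. A minimal sketch of that computation, assuming scikit-learn is available; the marker values are synthetic, not data from any of these studies:

```python
# ROC-based cutoff selection for a continuous marker, as used in several
# abstracts in this section. Values below are synthetic, not study data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

y_true = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])                  # outcome (e.g. sepsis)
marker = np.array([60, 74, 80, 95, 150, 110, 180, 192, 250, 400])  # e.g. ng/ml

fpr, tpr, thresholds = roc_curve(y_true, marker)
print(f"AUC = {roc_auc_score(y_true, marker):.2f}")

# Youden's index J = sensitivity + specificity - 1 = TPR - FPR;
# the threshold maximising J is a common choice of "best" cutoff.
best = np.argmax(tpr - fpr)
print(f"cutoff = {thresholds[best]:.0f} "
      f"(sensitivity {tpr[best]:.0%}, specificity {1 - fpr[best]:.0%})")
```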
Statistics For each parameter, cutoffs between the group of infected patients (before the occurrence of infection) and the noninfected patients were determined with receiver operating characteristic curves, and patients were then classified according to different specifications: did values beyond the critical cutoff occur on at least 2, 3 or 4 days, or on at least 2, 3 or 4 consecutive days, or did such values not occur in a patient at all? Contingency tables were set up for the yes/no decisions for each specification in the infected and noninfected groups, and sensitivity/specificity were calculated. The significance threshold was P < 0.05 (Fisher's exact test). The specifications with the highest significance were taken as relevant. Results PMN migration below a cutoff of 6% on at least 2 consecutive days occurred before infection in eight of the nine infected patients, but in only three of the 17 noninfected patients (i.e. a sensitivity of 88% and a specificity of 82%, P = 0.0008). Fever ≥ 38°C for 3 consecutive days showed a specificity of 94%, but a sensitivity of only 55% (P = 0.009). The other parameters had no significant discriminative power. Conclusions Among a variety of related parameters, PMN migration proved to be a sensitive predictive marker for infections: impaired PMN migration indicates impending infection. Early recognition of an infection risk may help to initiate aggressive antimicrobial therapy before the clinical manifestation of infection, thus improving therapeutic success.

We prospectively studied 23 patients who developed SIRS after admission to a 23-bed general ICU. Patients were divided into two study groups according to the time of onset of SIRS. Group A comprised eight patients (APACHE II score 19 ± 6, age 56 ± 18 years) who developed SIRS within 7 days after admission, and group B comprised 14 patients (APACHE II score 17 ± 8, age 51 ± 19 years) who developed SIRS after the first week of ICU stay. Clinical and laboratory measurements recorded for three consecutive days included temperature (T), white blood count (WBC), erythrocyte sedimentation rate (ESR), C-reactive protein (CRP) and PCT. Infection was confirmed by positive blood, bronchoalveolar lavage, urine or other body fluid cultures. A t test was used to evaluate differences between the study groups. Cutoff values for PCT were determined using Youden's index. The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of PCT were also estimated. Results Infection was confirmed in four (50%) patients in group A and in 11 (79%) patients in group B. The serum concentrations of CRP, WBC, ESR and T were similar in both groups, as well as between septic and nonseptic patients within each group. The PCT value in group A did not differ significantly from that in group B (7.1 ng/ml vs 0.5 ± 0.3 ng/ml, P = 0.06). However, septic patients in group A had significantly increased PCT values compared with septic patients in group B (9.48 ± 11 ng/ml vs 0.49 ± 0.3 ng/ml, P < 0.05). The best cutoff value, sensitivity, specificity, PPV and NPV for PCT in groups A and B are presented in Table 1.

Methods Sixty high-risk patients (mean age 51 ± 11 years) with two or more risk factors (severe initial condition, proinflammatory diseases in their history, repeated operations, expected cardiopulmonary bypass [CPB] time > 2 hours, age > 60 years, preceding hospitalization and antibacterial therapy) were enrolled into the study.
Besides routine clinical investigations, which were performed daily, blood plasma samples were collected before and at 1, 2, 3 and 6 days after surgery. PCT values were determined using an immunoluminometric assay (LUMItest PCT; BRAHMS Aktiengesellschaft, Germany). All patients were divided into two groups: group A without and group B with postoperative infectious complications. All data were compared by t test, and P < 0.05 was considered statistically significant. Results There was no difference in age, CPB time or aortic cross-clamping time between the groups. Initial PCT concentrations did not exceed normal values (< 0.5 ng/ml). Postoperatively, 14/60 (23.3%) patients developed infectious complications (10 pneumonia, one surgical site infection [SSI], one pneumonia and SSI, one sepsis, one pneumonia and sepsis). The ICU and hospital stays after surgery were significantly longer in group B than in group A (5.1 and 23.9 days vs 2.2 and 17.1 days, respectively; P < 0.05). PCT levels were significantly higher in patients with infection throughout the observation period, whereas a difference between the groups in APACHE II score appeared only from the second postoperative day. The data are presented in Table 1. Conclusions PCT levels are higher and remain increased for longer in patients with postoperative infection. PCT monitoring is useful for the early diagnosis and prediction of severe postoperative infections in cardiac surgery.

Methods A small molecular weight form of soluble CD14 (sCD14-ST) was detected by a new type of soluble CD14 ELISA, developed with two types of anti-CD14 antibodies and evaluated using serum samples from patients with sepsis, SIRS or other diseases and from healthy controls. At the same time, determinations of C-reactive protein, endotoxin, soluble CD14 (IBL-Hamburg) and procalcitonin (PCT) (BRAHMS) and SOFA scoring were also carried out. Results The new assay detected the small molecular weight form of CD14 specifically, but not the 49 kDa and 55 kDa soluble CD14 in serum. The concentration in normal controls (71 specimens) was 27.4 ± 14.4 (mean ± SD) ng/ml, and the cutoff level was established at 57 ng/ml (mean + 2SD). The ELISA detected 94.5% (52/55) of sepsis cases and 16.3% (13/80) of SIRS cases. The mean concentrations in sepsis and SIRS were 248.2 ng/ml and 41.4 ng/ml, respectively. In sequential studies, sCD14-ST showed a significant association with the SOFA score and the serum endotoxin level. Serum sCD14-ST increased in the very early phase of infection, and was detected much faster and more easily than by blood culture. The new assay did not correlate with soluble CD14 (IBL-Hamburg) or PCT (BRAHMS). Conclusion This ELISA can specifically detect sepsis during the early phase of infection and is useful for monitoring the severity of sepsis. The small molecular form, sCD14-ST, is a new marker for diagnosing sepsis.

Methods Subjects in the present study were 39 patients with SIRS admitted to the emergency unit. Plasma levels of SE were determined by ELISA at admission and 1, 3, 5, and 7 days after admission. The normal range of SE was 3.17-32.09 ng/ml. An organ failure was diagnosed when the patient had a SOFA score higher than 3 points in the corresponding organ dysfunction scoring system. A two-sided Fisher's exact test was used to analyze differences in the incidence of organ failure and in mortality; P < 0.05 indicated statistical significance.
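A minimal sketch of the two-sided Fisher's exact test named in these methods, assuming scipy is available. The 2x2 table reconstructs the respiratory-failure incidences reported in the results below (73.3% of 15 DAE patients, 16.7% of 24 DAN patients); it is an illustration, not the authors' analysis:

```python
# Two-sided Fisher's exact test on a 2x2 contingency table, as described in
# the SE/SIRS methods above. Counts are reconstructed from the reported
# percentages (11/15 = 73.3%, 4/24 = 16.7%).
from scipy.stats import fisher_exact

#                 failure  no failure
table = [[11,  4],   # DAE group (elevated SE at admission)
         [ 4, 20]]   # DAN group (normal SE at admission)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.1f}, two-sided P = {p_value:.5f}")
```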
Results and discussion SOFA scores in the first 8 days in the emergency unit were significantly higher in patients with elevated SE levels at admission (DAE group, n = 15) than in those with normal SE levels at admission (DAN group, n = 24). The incidence of respiratory failure in that period was significantly higher in the DAE group (73.3%) than in the DAN group (16.7%) (P < 0.01), and that of renal failure was also significantly higher in the DAE group (33.3%) than in the DAN group (0%) (P < 0.01). The mortality at 28 days after admission in the DAE group (20.0%) was significantly higher than that in the DAN group (0%) (P < 0.05). These observations suggest that determination of plasma SE levels might be useful for predicting the development of organ failure, especially respiratory and renal failure, and the outcome in patients with SIRS who carry a potential risk of developing multiorgan failure.

Introduction Metabolic changes in different organs in the critical state are often subtle and remain unrecognized with conventional hemodynamic and metabolic parameters. Since the development of the microdialysis technique, which mimics the passive function of a capillary blood vessel, it is now possible to examine the interstitial space of intact tissue. Preconditioning an animal with low doses of endotoxin has been shown to induce endotoxin hyporesponsiveness, the so-called 'endotoxin tolerance'. The exact pathophysiological mechanism involved has still not been fully elucidated; it appears to represent an essential mechanism preventing an unreasonable expansion of inflammatory or indiscriminate immune and metabolic responses before specific antibodies have developed. Before this study, no data were available on the effects of endotoxaemia, with and without endotoxin preconditioning, on regional metabolism monitored by the microdialysis technique. Materials and methods Ten pigs (25 ± 8 kg) were randomly assigned to a control group (n = 5) and a pretreated group (n = 5) before endotoxaemia was induced. Pretreatment consisted of injecting incremental doses of endotoxin (SAE) on days 5-2 before the experiment, ranging from 5 ng/kg on day 5, 10 ng/kg on day 4 and 30 ng/kg on day 3 to 50 ng/kg on day 2. The control group received saline injections at the same time intervals. On the day prior to the experiment, neither group received any injection. Results and discussion The main findings of this study describe various metabolic consequences of endotoxin preconditioning prior to lethal endotoxin shock, but also the effects of endotoxaemia per se on local metabolism in three tissue compartments. This was only possible by using the sequential microdialysis technique. Molecules in the tissue (extracellular fluid) surrounding the microdialysis capillary diffuse through the semipermeable part of the capillary into the dialysate, which is pumped through the capillary. Changes in the concentration of a substrate in the surrounding milieu are reflected by its concentration in the dialysate. Thereby, sampling of the dialysate at different time intervals, and subsequent analysis of the concentrations, gives an idea of a substrate's concentration in the extracellular space over the observation time. The role of cellular metabolism during sepsis is a highly controversial issue. While some studies suggest that tissue blood flow is reduced during endotoxaemia (leading to cellular oxygen debt and, consequently, to anaerobic metabolism), other studies suggest the septic insult to be one of cellular dysoxia.
Our data clearly confirm that endotoxaemia is accompanied by an increase in metabolism in various tissues and in systemic oxygen consumption.

Objective Not much is currently known about lipoproteins and their role in septic shock. Many studies have recently implicated lipoproteins in innate immunity against lipopolysaccharide, and it remains unclear whether low serum lipoprotein values are related to impaired innate immunity against endotoxin and to a poor prognosis. We conducted a prospective study to analyze serum lipids, glucose, triglycerides and C-reactive protein in septic shock patients and to evaluate their possible relation with outcome. Design A prospective observational analysis of serum from patients meeting the criteria for septic shock. Setting A 28-bed medico-surgical ICU in a university hospital. Patients Eighteen patients were analyzed in the study. We selected all consecutive patients who met the criteria for septic shock in our ICU and collected blood samples for analysis on days 1, 3, 6, 9, and 12 or until death. We analyzed total cholesterol, cholesterol fractions (high density lipoprotein [HDL], low density lipoprotein, very low density lipoprotein), triglycerides, glycemia and C-reactive protein (as a marker of inflammation). All results are presented as mean with standard deviation. For analysis we divided patients into survivors and nonsurvivors at day 12. We performed a paired Student's t test for differences in continuous variables, and correlation coefficients were determined according to multiple-level regression analysis. P < 0.05 was considered significant. Results Our mortality rate was 60%: 10 patients were in the nonsurvivor group and eight patients were survivors. The two groups had similar APACHE II scores (nonsurvivors 26 ± 6; survivors 24 ± 5; NS). At day 1 there were no statistical differences for any of the substances analyzed. From day 3 onwards there were statistically significant differences between survivors and nonsurvivors for total cholesterol, HDL fraction, triglycerides, glycemia and C-reactive protein. The correlations of C-reactive protein with the HDL fraction, total cholesterol, triglycerides and glycemia were poor. Only glycemia and triglycerides emerged as independent variables. Conclusions In our patients, hypocholesterolemia, low levels of the HDL fraction, hypertriglyceridemia and hyperglycemia were significantly related to a poor prognosis. C-reactive protein did not show a good correlation with the other parameters.

The objective of the present study was to investigate the pathophysiological role of iNOS induction in renal proximal tubules during experimental endotoxemia in humans. Methods A bolus injection of 2 ng/kg lipopolysaccharide of Escherichia coli (O113) was administered to eight healthy volunteers (four male/four female, age 24 ± 3 years), and 11 volunteers (five male/six female, age 22 ± 2 years) received the vehicle only (controls). At different time points, arterial blood and urine were collected and analyzed for nitrates/nitrites, the stable metabolites of NO. The amount of iNOS mRNA was determined by quantitative real-time RT-PCR from cells isolated from the urine. The urinary excretion of both glutathione-S-transferase-α (GST-α, present only in proximal tubular cells) and glutathione-S-transferase-π (GST-π, confined to distal tubular cells) was measured as well.
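The abstract quantifies iNOS mRNA by real-time RT-PCR but does not state its quantification model. One common way to express such data is the 2^-ddCt fold change against a reference gene and a control sample; a minimal sketch under that assumption, with invented Ct values:

```python
# Sketch of relative quantification by the 2^-ddCt method, a common (assumed,
# not stated) way to express real-time RT-PCR data such as the iNOS mRNA
# measured above. All Ct values below are invented.

def fold_change(ct_target_sample: float, ct_ref_sample: float,
                ct_target_ctrl: float, ct_ref_ctrl: float) -> float:
    """2^-ddCt: target normalised to a reference gene, relative to control."""
    d_ct_sample = ct_target_sample - ct_ref_sample  # dCt in the sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl        # dCt in the control
    return 2.0 ** -(d_ct_sample - d_ct_ctrl)

# e.g. iNOS vs a housekeeping gene, 24 h post endotoxin vs control
print(f"fold change = {fold_change(24.0, 18.0, 27.5, 18.2):.1f}x")
```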
Results Administration of endotoxin resulted in the expected increase in proinflammatory cytokines (TNF-α from < 0.015 to 856 ± 158 pg/ml, P = 0.002), accompanied by fever (maximum temperature 38.7 ± 0.3°C, P < 0.0001), flu-like symptoms and cardiovascular changes (heart rate from 63 ± 3 bpm at baseline to 91 ± 3 bpm at t = 5 hours, P < 0.0001; and mean arterial pressure from 96 ± 3 mmHg to 79 ± 4 mmHg, P < 0.0001). All changes were significantly different from the control group. Blood samples did not show a significant change in NO metabolites following the administration of endotoxin. Twelve hours after endotoxin administration, the urinary level of NO metabolites doubled (P < 0.05), whereas no significant change was observed in the control group. This effect was associated with an increase in iNOS mRNA measured 24 hours after endotoxin administration. Urinary excretion of GST-α increased from 0.7 ± 0.3 µmol/hour to 3.1 ± 1.3 µmol/hour at t = 12 hours (P < 0.01), whereas GST-π remained unchanged (from 0.2 ± 0.04 to 0.4 ± 0.13 µmol/hour). In controls, no difference in either GST-α or GST-π excretion was observed. Conclusion Our results indicate that systemic inflammation induced by endotoxin in humans results in upregulation of renal iNOS, associated with proximal tubule injury.

Background and aim Moderate hypothermia during CPB inhibits intramyocardial TNF-α synthesis. Since the expression of TNF-α and of other inflammatory mediators such as cyclooxygenase-2 (COX-2) and inducible nitric oxide synthase (iNOS) is regulated by the MAP kinases (p38 and ERK1/2 MAPK) and by the NF-κB pathway, this study was intended to analyze the effect of moderate hypothermia during CPB on these signaling pathways in the myocardium. Methods Twelve young pigs were assigned to a temperature (T°) regimen during standardized CPB: normothermia (T° = 37°C; n = 6) or moderate hypothermia (T° = 28°C; n = 6). Myocardial samples were sequentially taken from the right ventricle before, during and 6 hours after CPB. Myocardial expression of TNF-α, COX-2, and iNOS was detected and quantified by competitive RT-PCR and/or western blot. Phosphorylation of the p38 and ERK1/2 MAP kinases, and of IκB, was detected by western blot. Intracellular localisation of TNF-α, COX-2, iNOS, and NF-κB p65 and p50 was assessed by immunohistochemical staining. Results Both TNF-α mRNA and protein were detected as soon as 30 min after the initiation of CPB, before aortic clamping, in both groups, but levels were lower in pigs on hypothermia than in the others (P < 0.05). This difference persisted during and after CPB. iNOS and COX-2 were detected during and after CPB, but without any difference between the groups. Phosphorylation of the p38 and ERK1/2 MAP kinases and of IκB was detected before, during and after CPB. Levels of phospho-p38 MAP kinase, but not of ERK1/2 MAP kinase or IκB, tended to be lower in animals on hypothermia than in the others (P < 0.1). Conclusion This study shows for the first time that cardiac surgery induces the expression of TNF-α in the myocardium as soon as 30 min after the institution of CPB, before aortic clamping. This is associated with activation of the p38 MAP and ERK1/2 MAP kinases and the NF-κB pathway. The inhibition of TNF-α expression by hypothermia is related to the inhibition of p38 MAP kinase.

Urinary trypsin inhibitor (UTI), an elastase inhibitor extracted from human urine, has been used in Japan for more than 15 years to treat patients with acute pancreatitis or acute circulatory failure.
Objective The current study was performed in order to evaluate the efficacy of UTI for vascular endothelial disorders in critical illness. Methods Thirteen severely ill patients, with an APACHE II score above 20 on arrival at our emergency room, were selected and randomly assigned to either a treated group or an untreated control group: five patients were in the treated group and eight in the untreated group. The maximum dose of UTI (30,000 U/ml) was administered to patients in the treated group in the emergency room. After admission to the ICU, the plasma levels of neutrophil elastase, IL-6, thrombomodulin (TM), protein C and UTI were examined daily until the 7th day after hospitalization. Values are expressed as mean ± SD. An unpaired Student's t test was used and P < 0.05 was considered statistically significant. Results In the untreated group, the plasma UTI level on the 7th day after hospitalization was significantly higher than that in the treated group (96.9 ± 44.4 vs 15.6 ± 6.5 U/ml, P < 0.05). The TM level was also significantly different between the untreated and the treated group on the 7th day (6.0 ± 2.4 vs 3.6 ± 1.4 FU/l, P < 0.05). The other examined values showed no significant differences between the two groups during the clinical course. The half-life of UTI is very short (about 40 min) and the plasma level decreases linearly until 3 hours after administration; the difference in UTI levels on the 7th day was therefore considered to reflect endogenous UTI. Vascular endothelial disorders were recognized in the group with the higher UTI level on that day, which was the untreated group. This study suggests that high-dose administration of UTI in the very acute phase of critical illness is important to inhibit vascular endothelial disorders.

We investigated the effect of urinary trypsin inhibitor (UTI) on plasma polymorphonuclear elastase (PMNE) and cytokine output. The participants in this study were 15 trauma patients with shock on arrival at our institution. We divided them into two groups: one group was administered 300,000 U UTI intravenously, and the other served as the control group. At 0, 1, 3, 5, and 7 days after hospitalization we measured the serum levels of IL-6, IL-8, UTI, PMNE, thrombomodulin and protein C, and the SIRS score and SOFA score as indices of severity. The PMNE levels in the UTI-administered group were significantly lower than those in the control group at 1 and 3 days after hospitalization. These data suggest that UTI can suppress vascular endothelial cell disorders by inhibiting the production of PMNE and by direct inhibition of PMNE.

Background Exosomes are small vesicles (50-100 nm) released from cells after activation. We have previously shown that exosomes derived from platelets of septic patients induce apoptosis of endothelial and vascular smooth muscle cells through a redox-dependent pathway. Since reactive oxygen species may be involved in the myocardial dysfunction of sepsis, the aim of this study was to investigate a possible role of exosomes in sepsis-induced myocardial dysfunction. Methods Exosomes were separated by filtration and ultracentrifugation of plasma from 26 septic shock patients (SSP) and from 10 healthy volunteers. After separation, exosomes were infused at a 0.5-fold plasma concentration into Langendorff-perfused hearts from 22 New Zealand rabbits.
In other experiments, 12 isolated rat papillary muscles were exposed to a 0.5-fold plasma concentration of exosomes from septic patients and healthy controls. Results Incubation with exosomes from SSP for 20 min induced a 17% and 30% reduction in the positive and negative time-pressure derivatives, respectively (n = 7, P < 0.05). Incubation with exosomes from healthy volunteers for the same time induced a nonsignificant 9% and 19% decrease in the positive and negative time-pressure derivatives, respectively (n = 5, P = NS). This effect was 60% and 83% inhibited by L-NMMA (10^-4 M). Exposure of rat papillary muscles to exosomes from SSP caused a decrease in developed tension (3.22 ± 0.77 g/mm2 pre vs 2.86 ± 0.80 g/mm2 post, n = 8, P = 0.007) and in the positive time-tension derivative (31.8 ± 9.32 g/mm2/s pre vs 28.06 ± 9.31 g/mm2/s post, n = 8, P = 0.01). Incubation with exosomes from healthy volunteers induced no significant difference in developed tension (3.08 ± 0.61 g/mm2 pre vs 2.95 ± 0.72 g/mm2 post, n = 4, P = NS) or in the positive time-tension derivative (34.18 ± 6.03 g/mm2/s pre vs 34.45 ± 6.75 g/mm2/s post, n = 4, P = NS). Conclusion Infusion of exosomes from platelets of septic patients induces myocardial dysfunction in isolated rabbit hearts and in rat papillary muscle preparations. The attenuation of the dysfunction by the nitric oxide synthase inhibitor L-NMMA indicates a possible role for nitric oxide (NO) in exosome-induced myocardial depression.

Sepsis is a syndrome resulting from an overwhelming systemic response to infection. It rapidly leads to organ dysfunction, accompanied by systemic inflammation and hematological abnormalities. In Gram-negative infections, lipopolysaccharide (LPS) has a dominant role in the activation of various cells, which secrete several proinflammatory cytokines including TNF-α. These cytokines are also important in inducing a procoagulant effect in sepsis. Toll-like receptor 4 (TLR4) is responsible for LPS signaling and causes the activation of NF-κB. It is clear that an inappropriate TLR response to bacterial infection can have critical consequences; a TLR4 signaling inhibitor is therefore attractive as a therapeutic agent for sepsis. A benzisothiazole derivative (M62812) was identified as a TLR4 signal transduction inhibitor using a reporter gene assay system. It also suppressed LPS-induced upregulation of proinflammatory cytokines, adhesion molecules and procoagulant activity in human vascular endothelial cells and/or peripheral mononuclear cells. The half-maximal inhibitory concentrations (IC50) of this compound in these assays ranged from 1 to 3 µg/ml. A single intravenous administration of M62812 (10 mg/kg) protected mice from death in a murine D-galactosamine-sensitized endotoxin shock model (Table 1). In this model, M62812 (20 mg/kg) completely reversed the elevation of inflammatory cytokines and the hematological abnormalities. Furthermore, M62812 (20 mg/kg) also prevented lethal shock in a murine cecal ligation and puncture model. LPS is a potent and predominant microbial mediator that induces an intense inflammatory and procoagulant response, which are closely correlated events. M62812 inhibits those responses in vitro and prevents lethal shock in animal models.
Therefore, we conclude that M62812 may become a novel therapeutic agent for sepsis and septic shock by inhibiting TLR4 signal transduction and improving endothelial cell and leukocyte functions.

The purpose of the present study was to evaluate the effects of continuously infused N-acetylcysteine (NAC) on serum cytokine levels and gastric intramucosal pH in humans with severe sepsis. Fifty-three patients were included in the study. A 150 mg/kg NAC bolus (n = 27, NAC group) was given intravenously over 5 min, followed by an intravenous infusion of 12.5 mg/kg/hour for 6 hours. The control group (n = 26) was administered the same dose of 5% dextrose solution. Hemodynamic parameters (heart rate, mean arterial pressure), nasopharyngeal body temperature, arterial blood gas values (pH, PO2, PCO2), plasma cytokine levels (IL-1β, IL-2R, IL-6, IL-8, tumor necrosis factor-α), biochemical parameters (hematocrit, leucocytes, thrombocytes, urea, creatinine, total bilirubin, direct bilirubin, total protein, albumin, serum glutamate oxalate transaminase, serum glutamate pyruvate transaminase, sodium, potassium) and intramucosal pH, as well as length of intensive care unit stay, duration of mechanical ventilation support and mortality, were recorded and compared with the control group. All measurements were obtained at baseline (15 min before the start of the study) and were repeated immediately after, and at 24 and 48 hours after, the NAC infusion. No differences were found in any of these parameters. We found that NAC infusion did not affect cytokine levels, patient outcome or gastric intramucosal pH in severe sepsis in humans. Because of the limited number of patients in our study and the short period of observation, our findings need to be confirmed by larger clinical trials of NAC infused in a dose-titrated manner. However, our results do not support the use of NAC in patients with severe sepsis.

The myeloperoxidase (MPO) activity and the malondialdehyde (MDA) and 3-nitrotyrosine (3-NT) levels in lung homogenates were increased in the oleic acid (OA) group, and the administration of NAC significantly reduced tissue MPO, MDA and 3-NT levels (P = 0.0001). Lung histopathology was also protected by NAC in this OA-induced experimental lung injury model. In conclusion, the present study demonstrates that oleic acid induces myeloperoxidase activation and consequently increases 3-NT and MDA levels in lung tissue. Our data suggest that elevated 3-NT levels in lung tissue reflect excessive formation of peroxynitrite, and they support the efficacy of NAC treatment in the prevention of peroxynitrite-mediated, OA-induced lung injury. Due to its antioxidant and antiinflammatory properties, NAC seems to be a promising agent in the treatment of critically ill patients with lung injury states.

Aim This study investigated whether therapeutic administration of a novel antihuman interferon-gamma (anti-IFNγ) monoclonal antibody (mAb) could improve outcome in a lethal model of Gram-negative bacteremic shock. Methods Gram-negative bacteremic shock was induced in 14 anesthetized Cynomolgus monkeys by intravenous injection of approximately 10^10 cfu of live Escherichia coli. Treatment was administered only after the animals had developed symptoms of shock meeting at least two of the predefined criteria: a 30% reduction in blood pressure, a 30% increase in heart rate, and oliguria (urine output < 1 ml/kg body weight/hour). Six of the animals received placebo while eight were treated with 10 mg/kg humanized anti-IFNγ mAb.
Invasive hemodynamic monitoring under anesthesia was continued for 12 hours, after which the animals were returned to their cages and followed daily for clinical signs for 14 days. Results Five of the six placebo-treated Cynomolgus monkeys died or required euthanasia within 24-72 hours after E. coli challenge, while one animal survived for 5 days. In contrast, six of the eight animals treated with the humanized anti-IFNγ mAb survived for 7-14 days (P = 0.013 vs placebo). More specifically, within the treated group, two animals died early of sepsis (on day 3 and day 4, respectively), two animals were euthanized on day 7 because of limb necrosis (caused by catheter-related thrombosis) and not directly because of sepsis symptoms, one animal was euthanized on day 9 due to sepsis symptoms, and three animals survived 14 days and appeared to be in good health. Treatment with the anti-IFNγ mAb decreased the systemic TNF-α, IL-6 and IL-1β responses to E. coli. Furthermore, renal dysfunction, evidenced by increased creatinine, was significantly decreased by treatment with the anti-IFNγ mAb. Conclusions This study demonstrates that, in a primate model of E. coli-induced septic shock, the neutralization of IFNγ with a mAb, administered after the onset of clinical signs of sepsis, improves survival and attenuates the pathological changes associated with the development of multiple organ dysfunction. This suggests that IFNγ blockade potentially represents an effective mode of intervention in lethal septic shock.

Methods We simulated an interventional trial of a neutralizing antibody against tumor necrosis factor (anti-TNF) in sepsis, based on a mechanistic mathematical model that includes a bacterial infection, the host response, and a therapeutic intervention. Simulated cases differed by bacterial load and virulence, as well as by the individual propensity to mount and modulate an inflammatory response. We submitted 1000 cases to three doses and durations of anti-TNF, and present the results of the simulation. To evaluate the usefulness of modeling to improve patient selection, we constructed a logistic model with a four-valued outcome: (1) helped by treatment, (2) survives irrespective of treatment, (3) dies irrespective of treatment, and (4) harmed by treatment. Independent predictors were 'measured' at the time of disease detection and 60 min later, and included serum TNF, IL-10, IL-6, activated protein C, thrombin, tissue factor, blood pressure and cell counts. All results from the statistical model are reported in a validation cohort. Results Control survival was 62.9% at 1 week. Depending on the dose and duration of treatment, survival ranged from 57.1% to 80.8%. Higher doses of anti-TNF, although effective, also resulted in considerable harm (Fig. 1). Conclusions Our model points out how to improve patient selection and how therapy could be individualized.

Impaired host defense mechanisms following surgical stress such as burns, major surgery and polytrauma are considered important for the development of infectious complications and sepsis. We tested whether interferon (IFN) gamma can improve monocytic and lymphocytic functions and can reduce deaths related to sepsis. In order to restore their antimicrobial defense capacity, recombinant human IFN-gamma, 100 µg, was administered intravenously once daily to six immunoparalyzed patients. Informed consent was obtained from patients in all cases, and the study received local hospital ethical committee approval.
Intracytoplasmic Th1 and Th2 cytokine production in isolated peripheral blood mononuclear cells was assessed by flow cytometry following in vitro activation with phorbol myristate acetate plus ionomycin. Monocytic human leukocyte antigen-DR (HLA-DR) expression was also measured. Immunoparalysis was defined as monocytic HLA-DR expression < 30% or a Th1 level < 10%. IFN-gamma applied to immunoparalytic patients with low monocytic HLA-DR expression completely restored the deficient HLA-DR expression and in vitro Th1 cytokine production within 3 days. Recovery from immunoparalysis resulted in clearance of sepsis in four of six patients. IFN-gamma administration in septic patients with immunoparalysis is a new therapeutic strategy.

years, P < 0.05) and had a higher SAPS II score (26 ± 10 vs 18 ± 9, P < 0.01). The delay before admission to the critical care unit (CCU) was longer in the pneumonia group (0.3 ± 0.7 days vs 3.2 ± 4.0 days, P < 0.01). The delay before treatment by a physiotherapist was also longer (3.7 ± 2.0 days vs 8.1 ± 8.0 days, P < 0.01). Pneumonia was associated with a longer period of mechanical ventilation (10 ± 11 days vs 21 ± 13 days, P < 0.05) and a longer stay in the CCU (8.6 ± 9.5 days vs 39.2 ± 29.3 days, P < 0.0001). Univariate analysis of the qualitative factors is presented in Table 1. Early mechanical ventilation, upper cervical spine involvement and in-hospital mortality were not related to pneumonia. Overall mortality was higher in the pneumonia group. Our study suggests that early and aggressive management of patients with cervical spine injury, particularly those with a complete motor deficit, could be beneficial in decreasing the incidence of pneumonia and its associated adverse events.

In the past few years the clinical pulmonary infection score (CPIS) has been proposed as a diagnostic tool in ventilator-associated pneumonia (VAP). The CPIS incorporates five variables: temperature (T°C), white blood cells (WBC), tracheal secretions, pO2/FiO2 and the chest radiograph (RX). The CPIS ranges from 0 to 12, and its positive predictive value (PPV) has been found to be > 90% at a cutoff point ≥ 6. Recently, the CPIS has also been proposed for monitoring the course of VAP and the efficacy of therapy. Objective To evaluate the CPIS in the diagnosis of VAP and in monitoring the course of illness in children with VAP. Design A retrospective observational study. Setting The pediatric ICU of a national children's hospital. Data collection The CPIS was calculated 48 hours before the diagnosis of VAP, at diagnosis, and daily for the first week. In all patients the mean airway pressure (PAW), positive end-expiratory pressure (PEEP), PO2/FiO2, RX score, PCR (C-reactive protein), WBC, T°C and number of bronchial suctionings/day were evaluated. We also evaluated the influence of each CPIS parameter on the final score. Data were analysed with the Student t test (P < 0.05*), chi-square test (P < 0.05**) and univariate analysis (correlation coefficient R2***). Results In our population the incidence of VAP was 37% (15/40 patients), with a mortality of 27% (four patients). The mean CPIS value at the time of diagnosis was statistically higher than at 48 and 24 hours before diagnosis in all patients (4.9 ± 0.8 before vs 7.5 ± 1.5 at diagnosis, P < 0.05); in 14/15 patients it was ≥ 6 (PPV 93%). At day 3 after diagnosis the population was divided into two groups: patients with CPIS ≥ 6 (group 1, n = 5) and patients with CPIS < 6 (group 2, n = 10).
At this time, statistically significant differences between the two groups were found in PEEP (10.5 ± 1 vs 5.6 ± 2.7, P < 0.05*), maximum PAW (27.5 ± 6.6 vs 14.7 ± 8.6, P < 0.05*), pO2/FiO2 (145 ± 10 vs 282 ± 20, P < 0.05*), PCR reduction > 30% (20% vs 50% of patients, P < 0.05**) and microbiological positivity (80% vs 20%, P < 0.05**). pO2/FiO2 increased significantly in group 2 (P < 0.05*), and univariate analysis revealed that only pO2/FiO2 was related to the CPIS score (R2 = 0.77***); WBC, T°C, number of suctionings/day and RX score were not related. All patients with CPIS < 6 survived, whereas 80% of patients with a persistently high CPIS 72 hours after the diagnosis of VAP died. Analysis of the MSOFA score revealed significant differences in the circulatory (P < 0.05*) and respiratory (P < 0.05*) scores between those who died and survivors at day 3 and later after the diagnosis of VAP. Conclusions The CPIS had a high PPV in the diagnosis of VAP (93%), is an early predictor of poor outcome in patients with VAP, and allows good monitoring of the course of illness.

Aim Ventilator-associated pneumonia (VAP) is a common cause of morbidity in ICU patients. There is still no evidence that invasive bronchoscopic techniques should form part of a routine approach to suspected VAP [1]. In our study we compared the sensitivity and specificity of specimens obtained by bronchoalveolar lavage (BAL) and by endotracheal aspiration (ETA), and we examined the value of the clinical pulmonary infection score (CPIS) in the diagnosis of VAP. Methods The study included 50 adult ICU patients who had received mechanical ventilation for more than 48 hours, were not receiving antibiotic therapy, and in whom VAP was clinically suspected. Patients with contraindications for bronchoscopy were excluded. VAP was defined by clinical criteria as follows: the presence of a new or progressive lung infiltrate on chest X-ray, plus at least two of the following: fever > 38.0°C or a temperature < 36.0°C, white blood cell count > 10,000/mm3 or < 5000/mm3, or purulent ETA. Microbiologic sampling was performed in all patients using both distal tracheal suction and BAL. The CPIS was calculated with a threshold value of 6 for the identification of VAP. Results BAL ≥ 10^4 colony-forming units (cfu)/ml was found in all patients with ETA ≥ 10^5 cfu/ml, and exact conformity was achieved. In 18 patients with BAL ≥ 10^4 cfu/ml, ETA was ≥ 10^4 but < 10^5 cfu/ml; although these ETA results were below the threshold level, they conformed with the BAL results. Taking BAL ≥ 10^4 cfu/ml as the reference, CPIS ≥ 6 had a sensitivity of 83% and a positive predictive value of 50% for the diagnosis. Thus, when CPIS ≥ 6 was used as the diagnostic criterion, the diagnosis was likely to be inaccurate in one of every two patients labeled as having VAP, and nearly one of every five patients with VAP was missed. Conclusions BAL results were not superior to ETA in the diagnosis of VAP. The threshold level for ETA may be reduced to ≥ 10^4 cfu/ml, and evaluation of the CPIS alone is not suitable for the diagnosis or exclusion of VAP.

Methods A prospective study over 9 months. All patients who required mechanical ventilation for 12 hours or more were included. At admission to the ICU, patients were randomized into two groups: one group was suctioned with a closed endotracheal suction system (CESS), and the other group with an open endotracheal suction system (OESS).
VAP was defined according to the following criteria: a chest radiographic examination showing a new or progressive infiltrate; new onset of purulent sputum; and a significant quantitative culture of a pathogen from respiratory secretions (tracheal aspirate > 10^6 cfu/ml, bronchoalveolar lavage > 10^4 cfu/ml or protected bronchial brush catheter > 10^3 cfu/ml). The statistical analysis was performed with the chi-square test and the Student t test; P < 0.05 was considered a significant difference. Conclusions We conclude that, in our series, the CESS reduced neither the incidence of ventilator-associated pneumonia nor that of exogenous pneumonia.

Background The CPIS is an accepted tool for the clinical estimation of ventilator-associated pneumonia (VAP), encompassing five components: temperature, blood leukocytes, tracheal secretions, oxygenation index and chest roentgenogram. VAP is considered a frequent complication in mechanically ventilated severe polytrauma patients, and current reports highlight the influence of VAP development on patient mortality [1]. The aim of this study was to evaluate VAP development in severe polytrauma patients according to CPIS criteria, and on that basis to determine the patients' clinical course. Design A prospective observational study. Setting A 17-bed trauma intensive care unit. Methods Thirty-one severe polytrauma patients (ISS > 26 at admission) requiring mechanical ventilation for more than 3 days who developed VAP were enrolled. We recorded the length of stay of all patients and examined the CPIS criteria daily for a period of 10 days after the occurrence of pneumonia. Patients were divided into subpopulations according to the severity of post-traumatic complications. We calculated the mean 10-day CPIS value for each patient, and considered a CPIS > 5 points to be indicative of VAP. Results VAP occurred between the 5th and 8th ventilator day. We observed that the clinical course of VAP depended on the severity of the post-traumatic complications. Patients with no post-traumatic complications had a mean 10-day CPIS value of 4.5. Among patients with severe complications, the mean 10-day CPIS value was 5.3 for patients with single organ failure (survival rate 100%), 7.1 for patients with pulmonary contusion (survival rate 65%), 8.1 for patients with severe sepsis (survival rate 26%) and 8.4 for patients with multiple organ failure syndrome (> 2 organs involved; survival rate 17%). VAP resolution was faster in patients with adequate initial antibiotic therapy. Conclusion These data suggest that over a period of 10 days a precise estimation of the severity of VAP and its dynamics can be obtained. Mean CPIS values showed a good correlation with the degree of post-traumatic complications. CPIS dynamics could serve as a good clinical tool for determining whether the patient is likely to recover or a bad outcome is to be expected.
Reference 1. Curr Opin Crit Care 2003, 9:397-402.

, OW 17 ± 23, P = 0.06), days of MV (ER 8 ± 10, OW 11 ± 16, P = 0.12) and mortality (ER 50%, OW 55%). The VAP incidence was 26% (36/137) in the ER group and 21% (16/77) in the OW group (P = 0.36). The VAP incidence per 1000 days of MV was 29 for the complete set, 30 in the ER group and 19 in the OW group (P = 0.07). VAP was diagnosed by means of fiberoptic bronchoalveolar lavage (BAL) in 50%, nonfiberoptic BAL in 19% and tracheal aspiration in 31%. The early VAP incidence was 27% in the ER group and 25% in the OW group (P = 0.83).
There were significant differences in the organisms isolated in these groups (100% multiresistant organisms in the OW group, P = 0.02). In late VAP there were also differences, due to Candida spp. isolation in 22% of the cultures (P = 0.03). Conclusion The initiation of MV in a setting other than the ICU is not related to a higher incidence of VAP. Patients coming from medical or surgical wards have a higher prevalence of multiresistant organisms in early VAP and of Candida in late VAP.

There are no reproducible parameters to predict the duration of ventilation and length of stay in RSV-induced respiratory failure in previously healthy children. Tasker [1] found that an alveolar-arterial gradient (AaDO2) > 400 with a mean airway pressure (MAP) > 10 in the first 24 hours, and an AaDO2 > 300 with MAP > 10 over the subsequent 24 hours, identified RSV-positive cases at risk of a prolonged stay. Another study found an AaDO2 > 253 in the first 24 hours to be the best predictor of developing severe disease with the need for a prolonged stay. Methods Data were collected from 118 cases of respiratory failure in a regional pediatric ICU over four RSV seasons. Forty-six were excluded with an alternative diagnosis; of the remaining 72 cases of bronchiolitis, 52 were RSV-positive (six not ventilated) and 20 were RSV-negative. Of the 46 ventilated RSV-positive cases, three had congenital heart disease, four had chronic lung disease, and one had been treated in a different unit prior to admission. Of the 20 RSV-negative cases, three had congenital heart and/or chronic lung disease. Both groups were assessed against Tasker's criteria, and a multiple regression model was used to identify predictive markers with length of ventilation as the dependent variable. Results We have shown that the factors predicting the severity of RSV-positive and RSV-negative respiratory failure are similar. We were unable to confirm the findings of previous studies, suggesting that these are institution specific. In our institution a high MAP within the first 48 hours provides a predictive factor that can guide the clinician when talking to the parents and may help resource planning. A multicentre study in the UK is unlikely to yield more useful information.

During the year 2001, 283 patients were admitted to our intensive care unit (ICU). Forty-one patients had at least one positive blood culture (BC) during their ICU stay. These 14.5% of all patients required 52% of all treatment days. We found a good correlation between the length of ICU stay and the proportion of patients with positive BCs (Fig. 1) and with catheter-related bloodstream infections (CBI) (Fig. 2). Seventy-four patients (26%) stayed for more than 7 days in the ICU and required 75% of all treatment days; in one-half of these patients a positive BC was found. These 37 patients had a SAPS II score at admission comparable with that of the 37 patients without a positive BC, but they had double the ICU mortality and double the ICU length of stay (LOS) (Table 1).

Introduction Severe infections and the resistance of microorganisms to broad-spectrum antibiotics constitute a major concern for ICU physicians. Our purpose was to record the microorganisms usually responsible for infections in our ICU and their patterns of antibiotic resistance during a 6-month period. Methods We performed a prospective study between 1 June 2003 and 1 December 2003, including all patients hospitalized for more than 48 hours. Infections presenting during this period were recorded, as well as the sensitivity of the isolated microorganisms to specific antimicrobial agents.
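Several surveillance abstracts in this section express infection rates as episodes per 1000 exposure-days (for example, VAP per 1000 days of MV above). A minimal sketch of that incidence-density computation; the counts are invented, not taken from these studies:

```python
# Incidence density: episodes per 1000 exposure-days (e.g. ventilator-days),
# the denominator used in the VAP surveillance abstract above.

def incidence_density(episodes: int, exposure_days: int, per: int = 1000) -> float:
    """Return episodes per `per` exposure-days."""
    return episodes / exposure_days * per

# Illustrative only: 36 VAP episodes over 1200 ventilator-days ~ 30 per 1000.
print(f"{incidence_density(36, 1200):.0f} episodes per 1000 ventilator-days")
```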
Results A total of 65 patients were included in the study (53 men, 12 women), with a mean age of 54.6 years. The mean ICU stay was 24 days. One hundred and fifty-one infections were recorded: ventilator-associated pneumonia (62.3%), primary bacteremias (22.5%), catheter-related infections (9.4%), meningitis (1.3%) and surgical infections (2%). The microorganisms most commonly isolated were Pseudomonas aeruginosa (33.7%), Acinetobacter baumannii (28.5%), Klebsiella pneumoniae (14%) and Staphylococcus epidermidis (7.8%). Patterns of resistance were as follows: P. aeruginosa was sensitive to ceftazidime in 16.7% of cases, to imipenem in 38.4%, to ciprofloxacin in 10.2%, to piperacillin/tazobactam in 62.8% and to colistin in 98.7%. A. baumannii was sensitive to imipenem in 61% of cases, to ampicillin/sulbactam in 51.5% and to colistin in 76.5%. K. pneumoniae was sensitive to imipenem in 74% of cases, to piperacillin/tazobactam in 48% and to colistin in 88.8%. Conclusion Significant resistance of Gram-negative microorganisms to broad-spectrum antibiotics was noted, posing important therapeutic problems. A change of antibiotic policy and strict hand hygiene are some of the measures that could possibly help us face this ominous situation.

Results (Table 1) Global exhaustiveness was 81.2% for patients staying more than 48 hours in the ICU. False-negative pneumonia cases were more frequently those not documented by microbiology. There was also a strong correlation between the internal quality of the data (missing or discordant values in the surveillance database) and external validity. Conclusion Specificity was high, although sensitivity was quite low; bacteremia (BAC) was more accurately reported than pneumonia (PN). The lack of sensitivity might be related to factors such as the perception of the surveillance as an external control instrument, the professional background of the data collectors, and poor knowledge of the protocol (e.g. the case definitions). Validation is crucial for the credibility of data, provides an opportunity for personal technical assistance, and should be implemented on a continuous basis in surveillance programmes.

Conclusions Each patient is subjected to a significant number of CVC interruptions per day, which could be related to bloodstream infections, and specifically to S. epidermidis, due to hand manipulations during CVC interruptions.

Conclusions CDC guidelines for the prevention of catheter-related infection note that the subclavian route is preferable to the femoral and jugular venous accesses. Our data support these recommendations, but also suggest a new refinement: namely, that a peripherally inserted central catheter is preferable to femoral and jugular accesses, and that jugular access is preferable to femoral venous access.

One preventive measure is the use of antimicrobial-impregnated catheters, but a positive effect has been shown primarily for triple-lumen catheters. Objective To compare the incidence of CVC colonization in two groups of patients using a double-lumen central venous catheter impregnated with chlorhexidine and silver sulfadiazine or a standard catheter. Methods Patients who underwent insertion of a double-lumen CVC in the ICU were randomized to receive either a central venous catheter impregnated with chlorhexidine and silver sulfadiazine or a standard catheter. The catheter tips were cultured by the roll-plate method after removal. Results One hundred and nine patients were enrolled, with successful insertion of 109 catheters: 51 impregnated (group 1) and 58 standard (group 2).
There were no statistically significant differences between the groups in age, seven infection-related risk factors, ICU diagnosis, mean SOFA score, insertion sites, duration of catheterization, malposition on X-ray, signs of allergy, or catheter colonization rates. The mean duration of catheterization in group 1 and group 2 was 15.1 ± 9.5 days and 13.5 ± 8.1 days, respectively (P = 0.3). The mean SOFA scores in groups 1 and 2 were 5.04 ± 2.9 and 4.9 ± 3.1, respectively (P = 0.8). The colonization rates were 29.4% (15 catheters) in group 1 and 34.5% (20 catheters) in group 2 (P = 0.5). Thirty-one catheters grew Gram-positive cocci, four of them together with Gram-negative bacilli and three with fungi. Three catheters grew Gram-negative bacilli alone and one catheter grew fungi alone on roll-plate culture. This comparative study between a double-lumen central venous catheter impregnated with chlorhexidine and silver sulfadiazine and the standard one did not show any statistically significant difference in colonization rates between the two groups. Patients All patients admitted to a 12-bed digestive (medical and postsurgical) ICU presenting with organ dysfunction or a severe inflammatory response to their primary disease (C-reactive protein > 150 mg/l) were routinely scheduled for weekly screening for fungal colonization. A colonization index (CI) was computed for every patient. In 2000, pre-emptive antifungal therapy was administered to all patients with CI > 0.5, and subsequently interrupted when CI < 0.5. Due to economic concerns and reports of increasing resistance to antifungal drugs, we changed our therapeutic strategy in 2001 [2]. As a result, in 2002 antifungal therapy was administered only for probable or patent infections. The evolution of CIs, candidemia and the total cost of therapy are reported for 2000 and 2002. Results See Table 1. The CI decreased in 92% of patients receiving pre-emptive therapy. In all patients but one, fluconazole was the drug used for pre-emptive therapy. Despite the limits of our study, we can conclude: severely ill medical and postsurgical patients with digestive diseases are at risk for fungal infection and candidemia. Screening for fungal colonization allows early identification of patients at risk for fungal infection. Pre-emptive treatment efficiently lowers the level of colonization. A small number of costly curative antifungal treatments may prove more expensive than a greater number of cheaper pre-emptive treatments. Introduction and aims Nosocomial bloodstream fungal infections by Candida species in the ICU have risen dramatically in the past two decades. Fungal infections in the critically ill patient are difficult to diagnose and are associated with a high mortality rate. The high concentrations of some proinflammatory cytokines registered in the serum and in bronchoalveolar lavage (BAL) of patients with severe bronchopneumonia, and the different kind of host response during fungal infections, are well defined. Our aim was to correlate the IL-10 serum and BAL levels in critically ill patients with confirmed bronchopneumonia associated with candidiasis or bacterial sepsis, and to investigate whether these levels can confirm the fungal origin of severe pneumonia associated with severe sepsis. Patients included had been admitted to the ICU for more than 48 hours.
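As an aside on the colonization index used in the pre-emptive antifungal study above: the abstract does not state its exact formula, so the sketch below assumes the commonly used definition (number of colonized sites divided by number of sites screened):

```python
# Sketch: a colonization index (CI) as commonly defined (colonized sites /
# sites screened). The abstract does not give its formula, so this definition
# is an assumption for illustration. Pre-emptive therapy was triggered at CI > 0.5.

def colonization_index(site_cultures: dict[str, bool]) -> float:
    """site_cultures maps each screened body site to Candida-positive (True) or not."""
    if not site_cultures:
        raise ValueError("no sites screened")
    return sum(site_cultures.values()) / len(site_cultures)

ci = colonization_index({"urine": True, "trachea": True, "rectum": True,
                         "gastric": False, "skin": False})
print(ci, "-> start pre-emptive therapy" if ci > 0.5 else "-> observe")  # 0.6
```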
We enrolled 28 patients subdivided into two groups: (group A) 18 patients with severe sepsis and bacterial bronchopneumonia, and (group B) 10 patients with sepsis and confirmed candidemia with bronchopneumonia. We registered IL-10 levels in standardized blind nonbronchoscopic bronchoalveolar lavage (b-NB-BAL) and in serum at admission and then 1, 3 and 7 days after treatment. Cytokines were measured by ELISA. Results At present we have registered 28 patients aged from 51 to 91 years. Data (mean ± SD) are presented in Table 1. The IL-10 values confirm the results obtained by other authors in mice. In humans this correlation could be explained by the suppressive effects of high levels of IL-10 on mononuclear phagocyte activity against Candida and methicillin-resistant Staphylococcus aureus. In candidiasis with severe bronchopneumonia, the early increase of IL-10 levels, both in BAL and in serum, supports the diagnosis of fungal sepsis and is an adverse prognostic marker for the development of multiorgan failure and death. There were 37 chronic obstructive pulmonary disease patients, nine solid organ transplant recipients, 17 patients with autoimmune diseases, six cirrhosis patients and 20 patients with miscellaneous diseases. Following the EORTC/MSG criteria, the patients were classified as proven IA (n = 30), probable IA (n = 37), possible IA (n = 2) and 'colonization' (n = 20). The mean SAPS II score was 52, with a predicted mortality of 48.6%. Overall mortality was 80% (n = 71). Mortality in the proven and probable groups was 96.7% and 86.5%, respectively. Among the 18 patients who survived, 10 just had 'colonization' with Aspergillus and did not have risk factors for IA. Postmortem examination was performed in 47 of the 71 nonsurvivors (67%), and 29/47 autopsies (62%) showed hyphal invasion with Aspergillus (mainly with the lung as target organ). Among the proven cases (n = 30), 29 underwent autopsy (autopsy rate 97%), and one patient with lupus had a positive bronchial biopsy, was treated and survived. The remaining autopsies came from the probable group (n = 14, autopsy rate 44%) and the group with 'colonization' with Aspergillus (n = 4, autopsy rate 40%). Five of the 30 proven cases did not have compromising host factors according to the EORTC/MSG definitions (three liver cirrhosis, one pneumonia in a 95-year-old man, one Klebsiella sepsis with multiorgan failure). In conclusion, our study showed that IA is an emerging infectious disease in nonhaemato-oncological ICU patients and that a broad group of patients is at risk of IA. IA was diagnosed in patients without the characteristics described in the EORTC/MSG definitions. It seems worthwhile to investigate the validity of the available diagnostic tools in that group of patients. Background Evaluation of the clinical utility of aerosolized antibiotics in patients with ventilator-associated pneumonia has been hampered by the inefficiency of available nebulizers. The Aerogen pulmonary drug delivery system (APD) is in development for high-efficiency lung deposition of novel therapies in mechanically ventilated patients. We compared delivery of amikacin (AMIK) via the APD vs the Aeroneb Pro (AP) and the AirLife™ Misty Neb™ (MN) in mechanically ventilated patients. We wanted to determine how AMIK concentrations in the sputum relate to device efficiency previously determined in vitro [1].
Methods Twelve patients on volume-cycled ventilation (various primary diagnoses and ventilator settings) were enrolled. All had purulent secretions, but no fever or X-ray evidence of pneumonia. On separate study days (≥ 2 days washout), patients received sulfite-free AMIK (125 mg/ml): 800 mg via MN, 800 mg via AP, or 400 mg via APD (in anticipation of higher efficiency), in randomized order. Tracheal aspirates were obtained at 15 min postdosing and amikacin concentrations were determined. Data were compared with in vitro measurements of lung dose using typical adult ventilator settings. All treatments were well tolerated. Table 1 presents the evaluable sputum AMIK concentrations from tracheal aspirates normalized to the starting dose (mg/ml per mg starting dose), and the in vitro lung dose as the per cent of the total dose (means ± SD). Sputum concentrations were greater with AP and APD than with MN (P < 0.05). For the lung dose: APD > AP > MN (P < 0.05). Conclusions The APD delivered AMIK with greater efficiency than the marketed nebulizers. Concentrations of AMIK in the sputum were consistent with the relative device efficiency measured in vitro. High-efficiency delivery via the APD may make aerosolized AMIK a viable part of treatment regimens for ventilator-associated pneumonia. In the safety analysis (n = 395 patients), no difference was found between groups. Twenty patients in group L (10.3%) and 16 in group C + O (8.0%) presented at least one adverse event (AE) considered treatment-related (P = 0.42). Treatment was withdrawn following the occurrence of a treatment-related AE in 2.6% (L) vs 2.0% (C + O) of patients (P = 0.75). Conclusion Levofloxacin (500 mg twice daily) is at least as efficacious as the cefotaxime (1 g three times daily) + ofloxacin (200 mg twice daily) combination for the antibiotic therapy of patients with severe CAP hospitalized in an ICU, with success rates around 80%. Methods Twenty patients received linezolid (600 mg intravenously every 12 hours). CVVH was performed using highly permeable polysulfone membranes (PSHF 1200, Baxter, Germany and AV 400, Fresenius, Germany). The mean blood flow rate and ultrafiltration rate were 182 ± 15 ml/min and 40 ± 8 ml/min, respectively. Postdilution was used. Linezolid concentrations in serum and ultrafiltrate were determined by high-performance liquid chromatography. The mean peak linezolid serum concentration (Cmax) was 15.32 ± 3.98 µg/ml, and the mean trough level (Cmin) was 1.87 ± 1.70 µg/ml. The elimination half-life (T1/2) was 4.30 ± 1.74 hours. The total clearance (CLtot), hemofiltration clearance (CLhf) and volume of distribution (Vd) were 9.31 ± 3.48 l/hour, 31.25 ± 12.77 ml/min and 51.30 ± 12.30 l, respectively. Our results indicate that patients with severe Gram-positive infections undergoing CVVH can be treated effectively with a dose of 600 mg linezolid every 12 hours. Objective Ceftazidime is used for the treatment of ventilator-associated pneumonia (VAP) due to its recognized antipseudomonal activity. Standard ceftazidime treatment is by intermittent perfusion (IP); however, continuous perfusion (CP) may be advantageous, since its antibacterial activity depends on the time of exposure at concentrations above the minimum inhibitory concentration. However, evidence of its clinical efficacy in the treatment of VAP as compared with IP is limited and controversial. Thus, the aim of the present study was to determine whether ceftazidime by CP represents an optimization of the standard treatment of VAP.
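As a side note on the linezolid pharmacokinetic parameters reported above, a minimal sketch of how such values can be approximated under a simplified one-compartment assumption; the decline interval and sieving coefficient below are illustrative assumptions, not the authors' actual method:

```python
import math

# Sketch: simplified one-compartment estimates for linezolid under CVVH.
# The 10.5 h decline interval and the sieving coefficient of 0.8 are assumed
# for illustration only; they are not taken from the study.

cmax, cmin, decline_h = 15.32, 1.87, 10.5        # µg/ml, µg/ml, hours of decline
ke = math.log(cmax / cmin) / decline_h           # elimination rate constant, 1/h
t_half = math.log(2) / ke
print(f"T1/2 ~ {t_half:.1f} h")                  # ~3.5 h (study: 4.30 ± 1.74 h)

# Hemofiltration clearance: CLhf = ultrafiltration rate x sieving coefficient,
# where sieving = ultrafiltrate / plasma concentration.
quf_ml_min, sieving = 40.0, 0.8
print(f"CLhf ~ {quf_ml_min * sieving:.0f} ml/min")  # ~32 ml/min (study: 31.25)
```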
Methods We compared a prospective cohort of patients (n = 13) that received 4 g/day intravenous (IV) ceftazidime by CP with a historic control group (n = 32) that received 2 g/8 hours IV ceftazidime by IP. VAP was treated for 14 days with two antibiotics: ceftazidime plus another agent (aminoglycoside or quinolone). VAP was defined according to the following criteria: a chest radiographic examination showing a new or progressive infiltrate; new onset of purulent sputum; and a significant quantitative culture of a pathogen from respiratory secretions (tracheal aspirate > 10^6 cfu/ml, bronchoalveolar lavage > 10^4 cfu/ml or protected bronchial brush catheter > 10^3 cfu/ml). Differences between groups were tested by means of Student's t test and the exact chi-square test by permutation, using StatXact software 5.0. A significant difference was defined as P < 0.05. There were no significant differences in age, sex, diagnosis, APACHE II score, etiologic agents, bacteremia, organ dysfunction or antimicrobial therapy between groups. The CP group showed significantly lower clinical failure (CP, 0/13 [0%] vs IP, 15/32 [46.9%], P = 0.001) and significantly lower mortality attributable to VAP (CP, 0/13 [0%] vs IP, 8/32 [25%], P = 0.049). In addition, CP patients received a one-third lower daily dose than those treated intermittently. We conclude that ceftazidime administered by continuous perfusion for the treatment of ventilator-associated pneumonia may significantly improve clinical efficacy compared with intermittent administration, while reducing the antibiotic dosage. Background and goal of study Recently, the administration modus for β-lactam antibiotics has become a matter of debate, and continuous infusion has been suggested [1]. We studied whether plasma concentrations of imipenem were sufficiently maintained using a loading dose and continuous infusion regimen, and whether this regimen would be superior to a higher amount of drug given by the standard intermittent bolus regimen. We randomized 20 critically ill patients with ICU-acquired pneumonia to receive imipenem either as 2 g/24 hours by continuous infusion (CON, n = 10) or as bolus dosing (1 g every 8 hours) (BOL, n = 10). In both groups, a loading dose of 1 g imipenem was administered, followed by continuous infusion or further bolus treatment 4 hours after loading. Plasma imipenem concentrations were measured at baseline and at 4, 10, 16, 22, 46 and 70 hours. Arterial blood samples were taken immediately prior to the start of the continuous regimen and before each respective bolus administration. Patients' age (62 ± 16 vs 59 ± 16 years), SAPS II score (43 ± 12 vs 44 ± 14) and renal function (creatinine clearance, 128 ± 35 vs 122 ± 33 ml/min) were comparable between the groups. Results and discussion After 4 and 10 hours, plasma imipenem concentrations were similar in both groups. However, mean imipenem plasma trough concentrations at the subsequent time points (16, 22, 46, and 70 hours) were significantly higher in CON than in BOL. For comparison, at 16, 22, 46 and 70 hours all patients in CON had concentrations > 2 µg/ml (the MIC50 for Pseudomonas aeruginosa), while this was achieved in only three of 10 patients in BOL. These data suggest that continuous infusion is advantageous and its benefit should therefore be investigated in clinical outcome studies.
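A side note on the rationale shared by the two continuous-infusion studies above: beta-lactam efficacy tracks the time the plasma concentration spends above the MIC (%T>MIC). A minimal simulation under an assumed one-compartment model with illustrative parameters, not the studies' patient data:

```python
import math

# Sketch: why continuous perfusion (CP) can extend time above MIC for a
# beta-lactam. Assumed one-compartment model with illustrative parameters
# (Vd 20 l, T1/2 2 h, a less susceptible isolate with MIC 16 mg/l).

vd_l, t_half_h, mic = 20.0, 2.0, 16.0
ke = math.log(2) / t_half_h                      # elimination rate constant, 1/h

# CP: 4 g/day -> steady-state Css = infusion rate / clearance, with CL = ke * Vd.
css = (4000 / 24) / (ke * vd_l)                  # (mg/h) / (l/h) = mg/l
print(f"CP steady state ~ {css:.0f} mg/l (above MIC for 100% of the interval)")

# IP: 2 g every 8 h -> C(t) = (dose/Vd) * exp(-ke*t); time until C falls to MIC:
c0 = 2000 / vd_l                                 # peak ~100 mg/l
t_above = math.log(c0 / mic) / ke
print(f"IP time above MIC ~ {t_above:.1f} h of 8 h ({100 * t_above / 8:.0f}%)")
```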
Conclusion Continuous imipenem infusion administered as 2 g over 24 hours maintains adequate plasma concentrations more reliably than a standard regimen (1 g every 8 hours), which is associated with insufficient trough concentrations. Introduction Delirium occurs in as many as 80% of mechanically ventilated (MV) medical intensive care unit (ICU) patients and is associated with poor outcomes. MV patients spend 43% of all ICU days in delirium, while all patients who become delirious spend about 25% of their ICU days in delirium. The 2002 SCCM sedation and analgesia guidelines call for routine delirium screening and treatment with haloperidol. We sought to determine whether these guidelines represented a change from current practice by examining haloperidol use in the ICUs of a large tertiary care academic medical center. Hypothesis We hypothesized that: (1) haloperidol is the most commonly used antipsychotic medication in our ICUs, and (2) use of haloperidol in patients receiving MV > 48 hours is significantly less than 80% (the rate of delirium in MV patients reported in the literature). Methods From 1 July 2000 to 30 June 2001 we prospectively collected data for all patients admitted to any ICU at the University of Pittsburgh Medical Center, a tertiary care academic medical center with over 120 ICU beds serving medical, surgical, trauma, neuro, and solid-organ transplant patients. We calculated overall rates of treatment with any antipsychotic and the mean total and daily doses for those who received haloperidol. There were 5592 ICU patients incurring 6033 hospitalizations and 6758 ICU admissions during the study period. Of all hospitalizations, 60.4% required mechanical ventilation and 11.9% received an antipsychotic in the ICU. Haloperidol was the most frequently used antipsychotic (79.5% of all administered doses), with risperidone (4.7%) and olanzapine (2.3%) the next most common. Haloperidol was given to 24.9% of patients ventilated for > 48 hours and on 6.9% of all ICU days. Patients ventilated for > 48 hours received a mean (SD) daily haloperidol dose of 10.2 (9.5) mg for 5.8 (8.0) days, or 24.0% of their ICU days. Only one in four ICU patients ventilated for > 48 hours received haloperidol. When used, however, the haloperidol dose and duration seemed appropriate. Either we do not appropriately treat the majority of delirious patients, and the SCCM guidelines represent a significant departure from existing clinical practice, or ICU delirium is not as common as the literature suggests. Further study is needed to define the incidence of ICU delirium in all ICU patient populations and to evaluate the impact of treatment with haloperidol on outcomes of critical care. Introduction Remifentanil hydrochloride (Remi) is being increasingly used in the critically ill. Remi-based analgesia and sedation supplemented with propofol (P) or midazolam (Mid) was compared with a hypnotic-based technique using fentanyl (Fent) or morphine (Morph) in combination with P or Mid for up to 5 days in 161 mechanically ventilated neurotrauma patients. Open-label treatment with Remi, Fent or Morph was randomised 2:1:1. Remi infusion was started at 9 µg/kg/hour and titrated to effect. Supplemental P (days 1-3) or Mid (days 4-5) was introduced at a Remi infusion rate of 18 µg/kg/hour to provide optimal sedation (SAS 1-3) and analgesia (none/mild pain). Fent or Morph were administered with P or Mid according to routine practice.
The mean arterial pressure (MAP), heart rate (HR), intracranial pressure (ICP) and cerebral perfusion pressure (CPP) were measured at the time of, and 10 min following, dose titrations (Table 1). Predictable pharmacokinetics and organ-independent elimination allow remifentanil hydrochloride (Remi) to be easily titrated to provide optimal analgesia, with rapid dissipation of effects even after prolonged infusions in critically ill patients. Remi-based analgesia and sedation supplemented with propofol (P) or midazolam (Mid) was compared with a hypnotic-based technique using fentanyl (Fent) or morphine (Morph) in combination with P or Mid for up to 5 days in 161 mechanically ventilated neurotrauma patients. Open-label treatment with Remi, Fent or Morph was randomised 2:1:1. Remi infusion was started at 9 µg/kg/hour and titrated to effect. Supplemental P (days 1-3) or Mid (days 4-5) was introduced at a Remi infusion rate of 18 µg/kg/hour to provide optimal sedation (SAS 1-3) and analgesia (none/mild pain). Fent or Morph were administered with P or Mid according to routine practice. Study drugs were reduced/stopped for daily scheduled assessments of neurological function. In an observed-case analysis (on log-transformed data), the overall mean neurological assessment time (unpaired t test) and the between-subject variability (F test) around the assessment time were significantly reduced in the Remi group compared with Fent and Morph. Table 1 presents the median (range) time from reducing analgesia/sedation until neurological assessment (hours). Assessment of neurological function can be achieved significantly faster and more predictably when using remifentanil-based analgesia and sedation. Introduction Remifentanil's titratability and rapid organ-independent metabolism make it ideally suited for use in critically ill patients. Establishing safety in this population is important, as studies of remifentanil for greater than 3 days are limited. This study compared the safety and efficacy of remifentanil-based analgesia and sedation (RBA) with hypnotic-based sedation (HBS) in 105 ICU patients with medical conditions requiring long-term (up to 10 days) mechanical ventilation. One hundred and five patients were randomised to receive either RBA or HBS. Remifentanil infusion started at 6-9 µg/kg/hour and was titrated to effect to provide optimal analgesia and sedation. A supplemental midazolam (MID) bolus was introduced at a remifentanil rate of 12-18 µg/kg/hour. HBS (MID + fentanyl or morphine at the investigator's choice) was administered according to routine clinical practice. Adverse events (AEs) were recorded throughout the treatment and post-treatment periods. No clinical differences were observed in the incidence of adverse events. Seventeen patients received RBA and 16 patients received HBS for at least 10 days. Table 1 presents the AEs. Remifentanil is well tolerated when used for periods of up to 10 days in intensive care patients on mechanical ventilation. No adverse events relating to prolonged recovery from mu-opioid effects were reported. Remifentanil infusion started at 6-9 µg/kg/hour and was titrated to provide optimal analgesia and sedation. A supplemental MID bolus was introduced at a remifentanil rate of 12-18 µg/kg/hour. HBS was administered according to routine clinical practice. The time from the start of treatment to the start of the weaning process and to extubation was recorded. The sedation agitation score and pain index were recorded throughout the study.
Results Twenty-nine (51%) RBA patients and 16 (33%) HBS patients started weaning during the 10-day treatment period and were extubated. The median percentage of time with appropriate analgesia/sedation was greater than 95% with both regimens. Table 1 presents the time from the start of the study drug to the start of the weaning period and to extubation. Conclusions Remifentanil-based analgesia and sedation significantly decreased time on mechanical ventilation. Remifentanil may be a useful agent in the cost-effective care of ICU patients. Thoracic surgery usually requires anaesthesia with one lung excluded from ventilation (one-lung ventilation, OLV). OLV increases the shunt fraction (Qs/Qt), the clinical manifestation of which is a fall in blood oxygen tension (and in SpO2). The defence mechanism that limits the fall in oxygen tension is hypoxic pulmonary vasoconstriction (HPV) in the lung excluded from ventilation. The anaesthetics used (especially inhalational agents) inhibit HPV. Purpose To evaluate the effect of total intravenous anaesthesia (TIVA; oxygen, air, propofol) on shunt and selected haemodynamic parameters in patients undergoing operations for neoplastic lung disease requiring OLV. Shunt and haemodynamic parameters were evaluated using data collected from a catheter introduced into the pulmonary artery (Swan-Ganz catheter). The study was carried out on 11 patients (American Society of Anesthesiologists [ASA] grades I and II) undergoing planned operations. All patients were premedicated orally with midazolam. The catheter was introduced into the pulmonary artery before anaesthetic induction, guided by the pressure curve. Patients received balanced anaesthesia combining epidural anaesthesia in the thoracic segment (thoracic epidural anaesthesia: fentanyl + 0.9% saline) and TIVA (propofol according to Roberts' scheme). Heart rate, blood pressure, SpO2, pulmonary artery temperature, pulmonary artery pressure, pulmonary capillary wedge pressure, cardiac output, cardiac index (CI) and Qs/Qt were recorded at nine time points: (I) before the induction of anaesthesia (10-30 min); (II) immediately after the induction of anaesthesia (after checking the tube position), horizontal position; (III) after setting the respiratory mixture (during two-lung ventilation), lateral position; (IV) 5 min after starting OLV, lateral position; (V) 30 min after starting OLV, lateral position; (VI) 5 min after resuming two-lung ventilation or immediately after ligation of the pulmonary artery of the operated lung; (VII) 30 min after resuming two-lung ventilation or immediately after ligation of the pulmonary artery of the operated lung; (VIII) 5-10 min before extubation; and (IX) 15 min after extubation. Statistical analysis showed significant decreases, relative to baseline values, in SpO2 (time points IV, V), mean arterial pressure (MAP) (V) and CO (IV). Significant increases were noted in pulmonary artery pressure (II-V, VIII) and Qs/Qt (II-VIII), highest at time points IV and V. Conclusions OLV increases Qs/Qt (43.78% at 5 min and 35.4% at 30 min after exclusion of the operated lung from ventilation) and consequently lowers SpO2. OLV is associated with decreased SpO2 and increased pulmonary artery pressure. Propofol-based TIVA was associated with an approximately fourfold increase in shunt.
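For reference, Qs/Qt values such as those reported in the OLV study above are classically derived from arterial and mixed venous (Swan-Ganz) samples via the Berggren shunt equation; a minimal sketch with illustrative numbers, not the study's data:

```python
# Sketch: the Berggren shunt equation behind the Qs/Qt values above.
# Example saturations and tensions are illustrative, not the study's data.

def o2_content(hb_g_dl, sat_frac, po2_mmhg):
    """Oxygen content (ml O2/dl): bound (1.34*Hb*SO2) + dissolved (0.0031*PO2)."""
    return 1.34 * hb_g_dl * sat_frac + 0.0031 * po2_mmhg

def shunt_fraction(hb, sa, pa, sv, pv, sc=1.0, pc=660.0):
    """Qs/Qt = (CcO2 - CaO2) / (CcO2 - CvO2); end-capillary blood assumed fully
    saturated at alveolar PO2 (here FiO2 1.0 during OLV)."""
    cc = o2_content(hb, sc, pc)
    ca = o2_content(hb, sa, pa)
    cv = o2_content(hb, sv, pv)
    return (cc - ca) / (cc - cv)

# During OLV: Hb 12 g/dl, SaO2 0.92 at PaO2 65 mmHg, SvO2 0.70 at PvO2 38 mmHg:
print(f"Qs/Qt ~ {shunt_fraction(12, 0.92, 65, 0.70, 38):.0%}")  # ~46%
```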
Increasing Qs/Qt did not cause a critical lowering of the arterial oxygen tension. During TIVA a transient decrease in CI and MAP was noted. Objective The objective of this study was to analyze the incidence of accidental withdrawal of nonintravascular catheters in critical care. Unfortunately, ketamine exerts some side effects. Many researchers use benzodiazepines to cover these undesirable effects. Study design A prospective controlled study. Setting A general ICU. Objective To assess the efficacy of ketamine and midazolam as an alternative for analgesia/sedation in asthma patients after polytrauma. Methods Using the model of Jahangir and colleagues [1], 14 patients with multiple injuries and a history of asthma were included during their ICU stay (nine spontaneously breathing, five mechanically ventilated). Patients were randomised into two groups. Group 1 patients received an infusion of ketamine and midazolam. The two drugs were mixed in one syringe (50 mg ketamine and 5 mg midazolam). The mean rate of infusion for ketamine was 150-250 µg/kg/hour and for midazolam was 0.015-0.03 mg/kg/hour. Midazolam was added to eliminate the side effects of ketamine and for sedation. Group 2 patients received an infusion of ketamine only (0.2-0.4 mg/kg/hour). The following haemodynamic and ventilatory parameters were monitored: heart rate, mean arterial pressure, respiratory rate, arterial and end-expiratory PCO2, arterial PO2, transcutaneous oxygen saturation (SpO2), pulmonary compliance and resistance (in ventilated patients), subjective perception of pain using the visual analogue scale (VAS, 1-10 points), the Ramsay sedation scale (wake and sleep levels), and the rate of appearance of side effects. Ketamine in a mixture with midazolam led to insignificant changes in haemodynamics; only in group 2 patients were more significant changes noted (tachycardia and mild hypertension). Respiratory function remained well maintained, except in some group 2 patients in whom mild hyperventilation was noted. We documented an increase in respiratory compliance and a reduction in pulmonary resistance. Midazolam helped to prevent side effects after ketamine administration, but after prolonged infusion of midazolam there were signs of accumulation and dose increases were necessary. The VAS score (between 1 and 6) was similar in both groups. The Ramsay scale score was also similar (conscious level 3), but with some cases of agitation in group 2 patients. Ketamine and midazolam as a mixture are a good alternative for analgesia/sedation in patients with asthma after multiple injuries. We consider that higher doses are needed for better patient comfort. The aim of the study was to evaluate three different anaesthetic methods for carrying out magnetic resonance brain scans in children. We analyzed 90 nonpremedicated ASA I-II children divided into three groups of 30 patients (A, B, C). A laryngeal mask [1] and inhalational induction were used in all three groups, with three different kinds of maintenance: inhalational, intravenous, and balanced anesthesia. The 30 'A' children (age 5.9 ± 4.8 years) underwent inhalational anesthesia with sevoflurane (1.2 MAC) in spontaneous breathing for induction and maintenance. The 30 'B' children (age 5.9 ± 3 years) underwent inhalational anesthesia with sevoflurane (1.2 MAC) for induction and intravenous anesthesia with propofol for maintenance [2].
In the 30 'C' children (age 5.1 ± 4.6 years) we used inhalational anesthesia with sevoflurane (1.2 MAC) for induction and balanced anesthesia for maintenance [2]. We found that the discharge time using inhalational maintenance with spontaneous breathing was shorter than with intravenous propofol maintenance. One side effect of halogenated agents is an increase in cerebral blood flow. In patients with reduced intracranial compliance (e.g. brain tumors), maintenance with balanced general anesthesia with sevoflurane and propofol could therefore be more appropriate (group C). In all three groups we did not find any significant side effects. In conclusion, we can confirm the synergy of the inhalational and intravenous techniques, combining the advantages of balanced general anesthesia for patients with reduced intracranial compliance. Introduction Some critically ill patients cannot absorb enteral medications. We noted that intravenous (IV) Prodafalgan (PRDFG), a recently introduced parenteral bioprecursor of paracetamol, caused hypotension. A prospective study was conducted. The population comprised ICU patients requiring IV PRDFG (rectal temperature ≥ 38°C, no enteral route). PRDFG (2 g) was infused over 15-20 min. Temperature, heart rate, systolic/diastolic blood pressure and mean blood pressure (MAP) were recorded before, during and 15-120 min following infusion. Interventions to correct hypotension were recorded. Results Seventy-two PRDFG administrations were recorded in 14 patients. MAP dropped significantly 15 min following PRDFG infusion. Initiation or an increase of noradrenaline occurred in 25% of the events and a fluid bolus was given in 8%. Conclusions PRDFG caused significant hypotension requiring active intervention in 33% of the events. These data substantiate the anecdotal knowledge of the relationship between paracetamol (acetaminophen) and hypotension in the critically ill, and associate it with the IV preparation. This prospective, randomized and controlled study aimed to compare the efficacy of two methods of analgesia in patients with chest trauma and rib fractures: continuous epidural catheter analgesia and regional intrapleural analgesia (RIPA). Seventy-three patients with unilateral chest trauma were randomized into two groups. In Group 1 patients we placed an epidural catheter (level Th8-Th10), and in Group 2 patients we placed an intrapleural catheter (fifth intercostal space). We used 1% lidocaine for analgesia in both groups. The control group received intravenous analgesia with an opioid. We applied a 10 ml lidocaine bolus to the intrapleural catheter four to six times per 24 hours and clamped the catheter for 10 min. For epidural analgesia we applied 50-100 mg lidocaine and started a continuous infusion at a rate of 2-3 mg/kg/hour, not exceeding a dose of 400 mg lidocaine per 24 hours. We monitored the level of thermohypoesthesia, the level of analgesia (visual analogue scale, 10 points), and parameters of respiratory and cardiovascular function (respiratory rate, arterial oxygen saturation [SaO2], heart rate, arterial blood pressure, central venous pressure, ability to expectorate). Thermohypoesthesia was unilateral and of significantly shorter duration in Group 2 patients, whereas in Group 1 patients it was bilateral and of longer duration. The level of analgesia was similar in both groups (0-4 points), but in Group 2 patients it was achieved by more frequent applications of lidocaine, which carries a potential risk of toxicity.
The respiratory rate decreased, PaO2 and SaO2 increased, and the ability to expectorate improved. In Group 2 patients the heart rate and arterial blood pressure remained stable, whereas in Group 1 patients there were episodes of tachycardia and hypotension, which are undesirable in trauma patients. The technique of placing an intrapleural catheter is also easier. Conclusion RIPA is an acceptable alternative to epidural analgesia in patients with thoracic trauma. Materials Forty-five patients were divided into two groups. Group I: 22 patients after cardiac arrest (CA), 17 men and five women, age 64 ± 13 years. CA was caused by ventricular fibrillation in 14 cases and by asystole in eight. In 16 patients CA occurred in the setting of an acute coronary syndrome. Fifty per cent of these patients died during hospital treatment. Group II: 23 patients, aged 60 ± 11 years, 20 men and three women, with stable coronary artery disease. In group I, venous blood samples were taken just after CA and on two consecutive days at 8:00 am. In these patients we assessed the concentrations of the following hormones: adrenocorticotropic hormone (ACTH), cortisol (Cort), renin, aldosterone (Aldo), vasopressin (AVP) and N-terminal pro-B-type natriuretic peptide (NT-proBNP). In group II patients, concentrations of the aforementioned hormones were assessed once, from a blood sample taken at 8:00 am. Results were analyzed statistically. Results Data (mean ± SD) are presented in Table 1. In patients after CA who died in hospital, compared with patients after CA who survived, markedly higher concentrations of Cort and renin and lower ACTH and AVP were found. (1) There is strong activation of the hormonal mechanisms regulating water-electrolyte balance and controlling blood pressure in patients after CA. (2) A high concentration of cortisol with a concomitant inadequate increase of ACTH suggests a poor prognosis in patients after CA. To clarify endocrine derangements in patients recovering from traumatic or nontraumatic brain injury (BI), 40 patients (31 men) with a mean age of 47 ± 17 years were investigated. BI was due to spontaneous intracerebral hemorrhage (n = 16), trauma (n = 16), ischemic stroke (n = 7) or a ruptured brain aneurysm (n = 1). The median Glasgow Coma Score on admission to the ICU was 8 and the duration of mechanical ventilation ranged from 2 to 120 days. Patients were enrolled in the study after being transferred to the rehabilitation unit. Hormonal assessment included the measurement of thyroxine (T4), tri-iodothyronine (T3), thyrotropin (TSH), cortisol, corticotropin (ACTH), prolactin, testosterone, estradiol, insulin-like growth factor I (IGF-1), and dehydroepiandrosterone sulphate (DHEAS). Functional outcome was assessed with the Glasgow Outcome Scale (GOS). In the entire patient population several endocrine abnormalities were observed, including low T3 (n = 5), low T4 (n = 5), high TSH (n = 6) or low TSH (n = 1), high cortisol (n = 4), low ACTH (n = 4) or high ACTH (n = 8), high prolactin (n = 15), low testosterone (n = 11), low IGF-1 (n = 15), and low DHEAS (n = 10). None of the patients had cortisol levels below the local reference range for unstressed individuals. The GOS ranged from 1 to 5 and its distribution was as follows: GOS of 1 (n = 4), GOS of 2 (n = 15), GOS of 3 (n = 11), GOS of 4 (n = 5), and GOS of 5 (n = 5). There were significant correlations between GOS and T3 (r = 0.44, P = 0.005), T4 (r = 0.35, P = 0.02), ACTH (r = 0.43, P = 0.007) and DHEAS (r = 0.34, P = 0.03).
In contrast, GOS did not correlate with TSH, cortisol, prolactin, testosterone, estradiol or IGF-1 levels. In conclusion, endocrine abnormalities are common in patients recovering from acute traumatic or nontraumatic BI and are related to patients' functional outcome. It remains to be defined whether such hormonal changes are adaptive or reflect pathology. Recently, a reduction in the morbidity and mortality of patients in a surgical ICU by maintaining normoglycemia with insulin was demonstrated [1]. Studies on mitochondria in sepsis [2] and other critical illness, and on hyperglycemia in diabetes [3], suggested that the effects of this therapy on mitochondrial integrity and the oxidative stress state may contribute to the positive results of the treatment. Twenty liver biopsies obtained postmortem from patients randomized to intensive insulin therapy (IIT) or conventional insulin therapy (CIT) were randomly selected for mitochondrial investigation. The studied patients in the CIT and IIT groups were comparable for age and for the type, severity and duration of critical illness. The mean blood glucose levels were 10.5 ± 0.6 and 5.6 ± 0.4 mmol/l (P < 0.0001) on a median daily insulin dose of 31 and 45 IU (P = 0.3), respectively. Hypertrophic mitochondria with an increased number of abnormal and irregular cristae and reduced electron density of the matrix were observed by electron microscopy in seven of the nine patients in the CIT group, in contrast to only one of the 11 IIT patients (P = 0.0018). In addition, significantly higher activities of complex III and complex IV of the respiratory chain, and a trend toward higher activities of complex I, complex II, complex V and glyceraldehyde-3-P dehydrogenase (an enzyme whose inhibition by superoxide has been linked to hyperglycemic complications in diabetes [3]), were found in the IIT group as compared with the CIT group. In conclusion, maintenance of normoglycemia with IIT appeared to prevent the ultrastructural and functional abnormalities of hepatocytic mitochondria associated with critical illness-induced hyperglycemia. These alterations may have contributed to the benefits of the intervention. Further analyses are needed to link the positive effect of IIT on mitochondrial integrity to an effect on the oxidative stress state in critical illness. In the nonintensive care setting, recognition of patients with elevated troponin T (cTnT) and the instigation of appropriate treatment, including antiplatelet agents and glycaemic control, reduces the risk of death and myocardial infarction [1]. In a surgical intensive care unit (ICU) (62% cardiothoracic), intensive insulin therapy reduces morbidity and mortality among critically ill patients [2]. Aims/methods A retrospective audit was performed on 180 consecutive admissions to a general (noncardiothoracic) ICU to establish the incidence of raised cTnT (cTnT > 0.1 ng/ml) and the treatment prescribed. The outcome measures were antiplatelet use, glycaemic control and all-cause mortality during the ICU admission. The unit protocol stated that an insulin infusion should be commenced if blood glucose (BM) > 11.1 mmol/l. The incidence of cardiac myocyte damage, diagnosed by an elevated cTnT, was 62/180 (33%), with an associated all-cause mortality rate of 32/62 patients (51.6%) vs 24/118 patients (20.3%) in cTnT-normal patients (P < 0.001). The median length of admission was 5.5 days for patients with a raised cTnT and 3 days for patients with a normal cTnT (P < 0.003).
In 70.9% of cases the raised cTnT occurred within the first 72 hours of admission to the ICU. Of those patients with an elevated cTnT, 10 were already on aspirin, two were on clopidogrel and 14 had clear contraindications to antiplatelet therapy. Four patients (6%) were prescribed aspirin, and 31 eligible patients (50%) did not receive antiplatelet therapy. In patients with an elevated cTnT, 60/62 sets of notes were assessed for glycaemic control. Of these patients, 38/60 (63.3%) had BM > 11.1 mmol/l and 59/60 (98.3%) had BM > 6.1 mmol/l. A past history of impaired glucose tolerance/diabetes mellitus was present in 12/38 (31.6%) of the patients with BM > 11.1 mmol/l. Insulin therapy was not commenced in 6/38 patients with BM > 11.1 mmol/l. In 19/32 (59.4%) of the patients prescribed insulin there was a delay in commencing the infusion (median 3.5 hours, range 1-9 hours). Hypoglycaemia (BM < 2.2 mmol/l) was documented in 3/32 patients receiving insulin. The mortality rate was high in all patients with a raised BM (BM 6.1-11.1 mmol/l, mortality 12/21 [57.1%] vs 20/38 [52.6%] for BM > 11.1 mmol/l; NS). The one patient without an elevated BM survived. Conclusions Elevated markers of cardiac myocyte damage are common in critically ill patients and are associated with an increased mortality rate. The use of antiplatelet agents and optimisation of glycaemic control in this group might reduce morbidity and mortality. The aetiology of raised cTnT in the critically ill and specific treatment outcomes require further investigation. Insulin requirement (in mainly cardiothoracic surgical patients) has been suggested to be more strongly associated with ITU mortality than poor glycaemic control [1]. We prospectively recorded insulin administration (soluble human insulin, by infusion; Actrapid®; Novo Nordisk) in consecutive general ITU patients admitted over a 1-month period to our unit, where guidelines aim to achieve an arterial blood glucose concentration between 4.5 and 8.0 mmol/l. Patients were subsequently divided into two groups according to whether or not they required insulin during the first 24 hours. Blood glucose was measured using Radiometer® ABL System 625 or 700 blood gas analysers. Samples were taken at least every 2 hours. Ninety-eight patients were included in the study: 51 received no insulin and 47 received insulin at some time during the first 24 hours of admission, and patients were grouped accordingly. There were no patients with a history of diabetes in the 'no insulin' group and 10 in the 'insulin' group. The mean (SD) number of hours with a recorded blood glucose > 8.0 mmol/l was 0.40 (1.41) for patients who received no insulin and 4.74 (4.77) for the 47 patients who received insulin. The mean (SD) total insulin dose in the first 24 hours in the 'insulin' group was 38.4 (57.3) IU. Mortality was 13.1% in the 'no insulin' group and 19.1% in the 'insulin' group (chi-square 1.238, P > 0.20). There were no deaths among the previously diagnosed diabetic patients. When these patients were excluded from the analysis, the mortality was 13.1% and 32.1%, respectively (chi-square 2.78, 0.05 < P < 0.10). The mean (SD) length of stay in the ITU was 4.12 (6.90) days in the 'no insulin' group and 6.77 (8.89) days in the 'insulin' group (one-tailed t test, unequal variance, P = 0.05). Conclusion An increased length of stay in the ITU is predicted if insulin is required to maintain blood glucose < 8.0 mmol/l in the first 24 hours of ITU admission.
An increased mortality may also be predicted if insulin is required to maintain blood glucose < 8.0 mmol/l in the first 24 hours of ITU admission in nondiabetic patients. Introduction An elevated blood glucose concentration is an important factor for mortality and morbidity in critically ill patients. Previously, a randomized controlled trial showed that a blood glucose level of about 6 mmol/l is associated with less multiorgan failure and a significantly higher survival rate compared with levels above 8 mmol/l. Platelets play a crucial role in hemostasis and inflammation. However, the effect of short-term elevated blood glucose levels on platelet activation has not yet been evaluated systematically. Objective To evaluate the influence of blood glucose levels on platelets in vitro. Methods Citrated blood samples were drawn from healthy blood donors (40% male, age 38 ± 13 years [mean ± SD]). Exclusion criteria were smoking and the use of drugs interfering with platelet function. Blood samples were adjusted with glucose (Sigma, Taufkirchen, Germany) to final concentrations of 5 mmol/l (control group), 10 mmol/l (group 1) and 15 mmol/l (group 2), respectively. Samples were incubated for 10 min at 37°C with fluorescence-labeled monoclonal antibodies against CD62P, CD41, CD36, or CD42b (all Beckman-Coulter, Krefeld, Germany). To evaluate platelet reactivity, 2 and 6 µM thrombin-receptor-agonist-peptide-6 (TRAP-6; Bachem, Heidelberg, Germany) or 5 and 10 µM adenosine diphosphate (ADP; Sigma, Taufkirchen, Germany) were added. Analyses were performed with a flow cytometer (Epics XL; Beckman-Coulter). The mean fluorescence intensity was calculated. Platelet aggregation was determined by the turbidimetric method (BCT, Dade Behring, Marburg, Germany). Aggregation was induced with ADP (200 µM/l), collagen (2 mg/l) and epinephrine (100 µM/l; all Dade Behring). Statistics for intergroup differences were performed by one-way ANOVA. The initial blood glucose concentration was 5.0 ± 1.1 mmol/l. The blood glucose level had no significant influence on the expression of CD36 and CD62P, with or without stimulation. By contrast, we observed a significant decrease in the expression of CD42b in group 2 compared with the control group (unstimulated, P < 0.001; TRAP-6, P = 0.005; ADP, P = 0.012). A similar observation was made for CD41 expression (unstimulated, P < 0.001; TRAP-6, P < 0.05; ADP, P < 0.001). In group 1 also, a significant decrease in CD41 expression was observed after ADP stimulation. No significant differences were seen by aggregometry with any agonist. This in vitro study demonstrates that elevated blood glucose levels reduce expression of the platelet receptors CD41 and CD42b. In contrast, platelet function measured by aggregometry showed no impairment. We conclude that acute hyperglycemia does not lead to lowered platelet function. Thirty-four patients in whom blood glucose (BG) levels were controlled by means of the artificial pancreas (target of BG control: 150 mg/dl) were investigated. The first measurement of GC (glucose clamp) was performed in the acute condition for all patients, and a second measurement was performed 1 week after the first in 20 patients. GC was performed with a clamped BG level of 80 mg/dl and insulin infusion rates (IIR) of 1.12 and 3.36 mU/kg/min. M1/M3 and I1/I3 indicate the M value (mg/kg/min) and blood insulin level (µU/ml) at an IIR of 1.12/3.36, respectively (normal value of M1: 5-10). M1/I1 and M3/I3 were calculated as indicators of insulin sensitivity (IS).
Patients were classified into four groups (A, B, C, D) according to M1 and the I/E ratio (mU/kcal): A, M1 < 5 and I/E ratio > 30; B, M1 < 5 and I/E ratio < 30; C, M1 > 5 and I/E ratio < 30; D, M1 > 5 and I/E ratio > 30. The following parameters were studied: (1) administered energy (glucose) (E), (2) administered insulin (I), (3) SOFA score, (4) M value, (5) daily mean BG level (BGm), (6) blood C-peptide reactivity (CPR), (7) IS, and (8) insulin clearance (IC) (ml/kg/min); a sketch of the classification follows the next abstract. (1) There was a negative correlation between the I/E ratio and M1/M3 (r = -0.31/r = -0.44). (2) Comparison between A (M1, 3.3 ± 1.1; I/E ratio, 55 ± 20; n = 23) and B (M1, 3.5 ± 0.8; I/E ratio, 18 ± 9; n = 11): there were significant differences in I (P < 0.005), BGm (183 ± 16 vs 153 ± 18, P < 0.005) and IC (19 ± 7 vs 14 ± 4, P < 0.05), but no significant difference in E, M1, CPR, IS or SOFA score. (3) Comparison between C (M1, 7.3 ± 1.7; I/E ratio, 12 ± 8; n = 15) and D (M1, 7.8 ± 2.2; I/E ratio, 62 ± 44; n = 5): there was a significant difference in I (P < 0.005) and BGm (145 ± 21 vs 168 ± 16, P < 0.05), but no significant difference in E, M1, M3, CPR, IS, IC or SOFA score. Interpretation The I/E ratio was a daily measurable indicator of glucose tolerance. However, there was a discrepancy between the M value and the I/E ratio in some patients (groups B, D). The mechanism of the discrepancy is unclear, but the influence of IC and/or an increase of glucose metabolism by BG itself (e.g. mass action effect, activation of glucose transporter-2) in group B, and a decrease thereof in group D, may be speculated, because the M value was measured at a BG level of 80 mg/dl, while the I/E ratio was obtained under BG control aiming at 150 mg/dl. Measurement of both the M value and the I/E ratio was considered useful not only for the evaluation of glucose tolerance, but also for further understanding of the mechanism of glucose intolerance in severely ill acute patients. 'Tight' glycaemic control in perioperative or critically ill patients may carry the risk of hypoglycaemia. However, a blood glucose target of 4.5-6.1 mmol/l has been shown to benefit critically ill, mainly postcardiothoracic surgery patients where, unusually, all patients were given glucose infusions from admission (200-300 g/24 hours). Of the 'tight' group, 5.2% had inconsequential hypoglycaemic episodes (blood glucose < 2.2 mmol/l). The perceived risk of hypoglycaemia in starved patients receiving insulin to achieve 'tight' glycaemic control is a widespread concern. We report safety monitoring in our ongoing prospective, double-blind, randomised controlled study (the Does Additional Glucose Make A Difference? trial), investigating whether an initial additional glucose infusion improves outcome in critical care patients receiving 'tight' glycaemic control. Patients received 50% glucose or 0.9% NaCl at 20 ml/hour until full nutrition was established. We monitored for excess hypoglycaemic episodes in our NaCl group. We set a 5% acceptable incidence of blood glucose < 3.0 mmol/l and 0% for adverse consequences. Hourly arterial line samples were tested by regularly calibrated Accu-Chek® (Roche Diagnostics) bedside monitors. Insulin (Actrapid®; Novo Nordisk), 50 U in 50 ml of 0.9% NaCl, was administered by continuous infusion and boluses according to an algorithm. The study period was the time that study infusions were given. Investigators remained blinded.
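Returning to the artificial pancreas study above: its A-D grouping is a simple two-threshold classification. A minimal sketch; handling of values falling exactly on a threshold is an assumption, since the abstract uses strict inequalities:

```python
# Sketch: the four-group (A-D) classification by M value and insulin/energy
# (I/E) ratio from the artificial pancreas study above. Thresholds M1 = 5 and
# I/E = 30 mU/kcal are as stated in the abstract; tie-breaking at the exact
# thresholds is an assumption for illustration.

def classify(m1: float, ie_ratio: float) -> str:
    """Return group A-D from M1 (mg/kg/min) and the I/E ratio (mU/kcal)."""
    if m1 < 5:
        return "A" if ie_ratio > 30 else "B"   # low glucose disposal
    return "D" if ie_ratio > 30 else "C"       # preserved glucose disposal

print(classify(3.3, 55), classify(3.5, 18), classify(7.3, 12), classify(7.8, 62))
# -> A B C D (matching the group means reported in the Results)
```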
Complete data were obtained from 113 patients (63 and 50 in each group) of the 127 who gave informed consent according to local medical ethics guidelines. No adverse incidents or deaths were recorded in the patients with incomplete data. There were no differences between the groups in (group 1 [mean, SD], group 2 [mean, SD]): age (66.7, 14.9), (67.1, 12.7); body mass index (77.3, 16.2), (79.8, 12.4); APACHE II score (13.8, 12.2); SOPRA (30.4, 12.2), (33.3, 10.5); admission reason (87%, 92% cardiac surgery); or death in the ITU (3.2%, 2%). Total hypoglycaemic (< 3.0 mmol/l) and hyperglycaemic (> 12.0 mmol/l) episodes (total hours of the study period) and the mean (SD) hours outside the prescribed range (4.5-6.1 mmol/l) for each patient during the study period are presented in Table 1. Tight glycaemic control appears safe in patients receiving either 50% glucose or 0.9% NaCl at 20 ml/hour. Severe injury with activation of the systemic inflammatory response syndrome stimulates the release of cortisol from the adrenal cortex. Relative cortisol insufficiency is well described in critically ill patients, especially in severe sepsis, and is associated with increased mortality. There are conflicting data regarding the effect of major trauma and haemorrhagic shock on endogenous cortisol levels and the effects of early relative cortisol deficiency. We measured random cortisol levels in 20 patients with severe trauma and haemorrhagic shock who were admitted to the ICU after initial resuscitation and emergency surgery. Injury severity scores (ISS), fluid and blood product requirements, inotrope requirements after fluid resuscitation, and mortality at 28 days were also measured. Relative cortisol deficiency was defined as a random cortisol level < 400 nmol/l. Six of the 20 patients (30%) had a cortisol level > 400 nmol/l. This group had a mean (± SD) ISS of 18 (± 7) and a mean cortisol level of 606 (± 155) nmol/l. None required inotropic support, and mortality was 0/6 (0%). Fourteen of the 20 patients (70%) had a cortisol level < 400 nmol/l. This group had a mean ISS of 28 (± 7.5) and a mean cortisol level of 253 (± 89) nmol/l. Six of the 14 (43%) required inotropic support, and mortality was 3/14 (21.5%). These patients also had significantly higher fluid and blood product requirements. Conclusion Cortisol deficiency is common in patients after major trauma and is associated with a higher ISS, increased fluid and blood product requirements, increased inotrope requirements, and increased mortality. These patients may benefit from early steroid replacement therapy. The objective of this study was to investigate the functional integrity of the hypothalamic-pituitary-adrenal axis in critical illness by stimulation with the low-dose ACTH stimulation test (LDST) and hCRH. The study included 16 (15 male) mechanically ventilated patients with a mean age of 52 ± 19 years. Underlying diagnoses included major operation (n = 7), multiple trauma (n = 5), stroke (n = 3), and pancreatitis (n = 1). Patients were enrolled in the study 3-14 days after the initiation of mechanical ventilation. Median APACHE II and SOFA scores on the study day were 13 (range 8-23) and 5.5 (range 4-12), respectively. All patients underwent stimulation first with the LDST (1 µg) and then, on the following day, with 100 µg hCRH. ACTH and cortisol concentrations were determined from -15 to 120 min after hCRH. Normal cortisol responses to the LDST and hCRH were defined as peak plasma concentrations above 18 µg/dl and 20 µg/dl, respectively.
An ACTH response to hCRH was considered appropriate if a twofold increase in ACTH was observed. In the entire patient population, baseline cortisol (mean ± SD) was 15.1 ± 7.1 µg/dl and stimulated cortisol (median) was 21.4 µg/dl following the LDST. Four (25%) of the 16 patients had subnormal stimulated cortisol levels after the LDST. These four patients also had subnormal peak cortisol concentrations following hCRH. Analysis of the individual ACTH responses in these four patients revealed two patterns: in three patients a twofold to eightfold increase in ACTH was noted, whereas one patient failed to appropriately augment ACTH levels (peak ACTH was 1.60 times the baseline value). In conclusion, a significant subset (25%) of critically ill patients has evidence of diminished cortisol production following dynamic stimulation with the LDST. This disorder is mostly due to primary adrenal dysfunction, but may also reflect hypothalamic-pituitary failure. The aim of the study was to determine the status of the hypothalamic-pituitary-adrenal axis in critically ill patients with early sepsis and/or septic shock, and to investigate whether adrenal responses are related to mortality. Forty-two patients (32 male; median age 62 years; range 17-82 years) had cortisol, corticotropin (ACTH) and dehydroepiandrosterone sulphate (DHEAS) levels measured at the onset of sepsis and/or septic shock. Adrenal responsiveness was assessed by the LDST. A peak cortisol < 18 µg/dl on the LDST was considered to represent an inadequate response. For the entire patient population, hormone concentrations were as follows (median or mean ± SD values): baseline cortisol 17.8 µg/dl, stimulated cortisol 24.8 ± 9.4 µg/dl, increment in cortisol 5.9 ± 4.4 µg/dl, ACTH 21.2 pg/ml and DHEAS 1553 ± 1157 ng/ml. Eight (19%) of the 42 patients had inadequate cortisol responses following the LDST. Overall, 21 patients died and 21 survived. There were no differences between survivors and nonsurvivors with regard to baseline cortisol (17.2 vs 18.9 µg/dl, P = 0.20), stimulated cortisol (23.5 vs 25.1 µg/dl, P = 0.94), ACTH (20.1 vs 26.1 pg/ml, P = 0.48) or DHEAS (1754 ± 1333 vs 1352 ± 939 ng/ml, P = 0.26) levels. In contrast, nonsurvivors had a lower increment in cortisol following the LDST than survivors (4.2 ± 3.5 vs 7.5 ± 4.7 µg/dl, P < 0.05). In conclusion, a substantial proportion (19%) of patients has evidence of adrenal hyporesponsiveness at the onset of sepsis and/or septic shock. Attenuated adrenal responses are associated with a higher mortality rate in such patients. Introduction Patients with septic shock have recently been found to have relative adrenal insufficiency. Etomidate is known to inhibit 11-beta-hydroxylase, which could interfere with the cortisol response to corticotropin. Hypothesis Patients who receive etomidate before the corticotropin stimulation test will have a higher incidence of relative adrenal insufficiency. In this retrospective study, the electronic records of the 1207 patients who had serum cortisol levels drawn between March 2002 and August 2003 were reviewed, and 163 patients who had a short corticotropin stimulation test performed during an ICU admission for septic shock were identified. Our cohort was divided into those who did or did not receive etomidate before the corticotropin stimulation test. The incidence of relative adrenal insufficiency was compared between these two groups.
Data collected included demographics, the presence of relative adrenal insufficiency, the presence of coagulopathy, the use of steroids or any medication known to interfere with cortisol synthesis, the use of etomidate and the time interval between its administration and the cosyntropin test, any history of adrenal/pituitary disease, and mortality. Relative adrenal insufficiency was defined as a response of 9 µg/dl or less after corticotropin stimulation. Septic shock was defined by the ACCP/SCCM criteria. Comparisons between groups were made using the chi-square test. The mean patient age was 64.6 years; 97% were white and 58% were male. Coagulopathy was found in 66% of the patients. Ten patients were on dexamethasone before the test. No patient was on any other medication known to interfere with the test, and no patient had a previous history of adrenal/pituitary disease. Of the 46 patients who received etomidate, 36 (78%) were diagnosed as having relative adrenal insufficiency, compared with 58/117 (50%) of the patients who did not (P = 0.0008). Relative adrenal insufficiency was noted in 79% of the patients who received etomidate within 6 hours of the test, compared with 78% of the patients who received etomidate more than 6 hours before the test (P = 0.9246). The mortality rate was 53% (50/94) in patients with relative adrenal insufficiency compared with 61% (42/69) in patients without relative adrenal insufficiency (P = 0.3287). There is an increased incidence of relative adrenal insufficiency in septic shock following etomidate administration, and this increased incidence is unrelated to the time interval between etomidate administration and the corticotropin test. Background and goals Chronic obstructive pulmonary disease (COPD) has been increasingly recognized as a systemic disease. The hormonal, metabolic and musculoskeletal implications of the generalized processes involving oxidative stress, inflammatory mediators, cytokines and endocrine hormones have only begun to be understood. The aim of this study was to assess changes in oxidative stress during the treatment of exacerbations of COPD. We measured erythrocyte (E) and plasma (P) glutathione peroxidase (GPx), malondialdehyde (MDA) and superoxide dismutase (SOD) in the pretreatment and post-treatment periods in 20 patients with acute exacerbation of COPD. Twenty healthy smokers and 20 nonsmokers with no history of lung disease served as control subjects. Results Data are presented in Tables 1 and 2. Our results were not consistent with earlier reports on the same topic. We consider that treatment does not always increase the antioxidant level, even when clinical improvement is achieved. Therefore, given that an increase in antioxidant capacity indicates tissue improvement, our results suggest that GPx, SOD and MDA levels can be used as markers of prognosis and of the success of treatment of an exacerbation of COPD. Objective To assess antioxidant enzyme activities in critically ill patients with sepsis compared with control patients without sepsis, and to evaluate changes in antioxidant enzyme activities in the convalescent period after successful treatment of sepsis. This was a prospective case-control study in a 13-bed medical adult intensive care unit.
Routine venous blood samples were obtained from septic patients (n = 16, age 26-80 years) with APACHE II score > 10 without antioxidant treatment, and from 16 age-matched, sex-matched hospitalised control patients without clinical and laboratory signs of sepsis. Paired convalescent samples taken 1 week after recovery were available in five patients. We measured the activity of superoxide dismutase with copper and zinc (EC 1.15.1.1, CuZn-SOD) in erythrocytes, based on superoxide generation by xanthine oxidase and tetrazolium reduction. The activity of paraoxonase (EC 3.1.8.1, PON1) in serum was measured with paraoxon and phenylacetate as substrates, and the serum concentrations of total and high-density lipoprotein (HDL) cholesterol and C-reactive protein were also measured. We considered P < 0.05 to be statistically significant. There was higher CuZn-SOD activity (mean ± SEM, 30,332.5 ± 2369.9 U/g Hb) in septic patients in comparison with healthy controls (23,192.2 ± 1078.7 U/g Hb; P = 0.01). On the other hand, PON1 activity measured with paraoxon (5.58 ± 0.71 U/ml) and with phenylacetate (17,642.3 ± 1501.1 U/ml) was lower in sepsis than in controls (9.55 ± 0.93 U/ml and 29,181.1 ± 2198.6 U/ml; P < 0.01 and P = 0.001, respectively). After recovery there was no difference between patients and controls in the activity of either CuZn-SOD or PON1. Moreover, we found a positive correlation between PON1 activity and total and HDL cholesterol levels in sepsis and after recovery. We observed no age or sex dependence in the activity of either enzyme in septic patients or controls. Conclusion Patients admitted to the intensive care unit with severe sepsis exhibit abnormal antioxidant enzyme activities.

Purpose To assess the efficacy of scavengers on lung injury during intra-abdominal sepsis in rats. Sprague-Dawley rats (male, 200-250 g) underwent cecal ligation and puncture (CLP); the cecum was punctured through and through with an 18-gauge needle to obtain two holes, and gentle pressure was applied on the ligated cecum to exteriorize a small amount of feces, followed by closure of the abdominal incision. Five hours later, these animals were injected intraperitoneally with polyethylene glycol-absorbed catalase, a scavenger of hydrogen peroxide (group A); dimethyl sulfoxide, a scavenger of hydroxyl radicals (group B); polyethylene glycol-absorbed superoxide dismutase, a scavenger of superoxide ions (group C); or saline (control group). Survival was evaluated for 24 hours after CLP, and surviving animals were then sacrificed by carbon dioxide inhalation for outcome assessment. Lung protein nitration was evaluated by the nitrotyrosine/tyrosine ratio and oxidative DNA damage by the 8-oxodeoxyguanosine/deoxyguanosine ratio in homogenized lung tissue, both measured by HPLC. Neutrophil accumulation in the lung was evaluated by myeloperoxidase (MPO) activity, and intercellular adhesion molecule-1 (ICAM-1) expression in the lung was evaluated by pathological examination. Results MPO, ICAM-1 expression, the nitrotyrosine/tyrosine ratio and the 8-oxodeoxyguanosine/deoxyguanosine ratio were significantly lower in all the scavenger-treated groups (A, B, C) than in the saline-treated group. However, there was no difference in the survival rate among the groups (A, 33.3%; B, 20%; C, 0%; control, 27.7%). Conclusion Scavengers attenuate neutrophil accumulation and lung injury, but do not improve survival. These results suggest that factors other than lung injury may be responsible for mortality in this rat model of sepsis.
However, additional studies, with both superoxide scavenging in the early phase and late assessment of outcome, are indicated.

Conclusions As SH after cardiac surgery appears to be common and associated with major morbidity, levels should be routinely monitored.

In everyday practice we observe a difference between the potassium levels measured by routine biochemistry and by the blood gas analyzer. This is expected, since the former measures serum potassium while the latter measures plasma potassium. The serum/plasma potassium concentration difference is related to the patient's platelet count and increases to statistically significant levels in thrombocytaemia. We measured the serum/plasma potassium difference of ICU patients within the first 6 hours of their admission and compared it with the difference measured in chronically severely ill (uremic) patients, as well as in a group of healthy volunteers. We measured the serum potassium (SKA), plasma potassium (PKA), serum/plasma potassium difference (SPdif-A = SKA - PKA), age and platelet (PLT) count in 50 ICU patients (Group A). They had been previously healthy and had suffered a major catastrophe (road traffic accident, gastrointestinal bleed, major trauma) less than 6 hours before the measurements. We also measured the serum potassium (SKB), plasma potassium (PKB), serum/plasma potassium difference (SPdif-B = SKB - PKB), age and PLT count in 31 chronic uremic patients on renal replacement therapy (Group B). These patients had come for treatment as scheduled, with no evidence of any other acute disease. The same variables (SKC, PKC, SPdif-C) were measured in 20 healthy volunteers (Group C). Blood samples were obtained from the radial artery. Serum potassium was collected in a Vacutainer tube, plasma potassium in a heparin-treated syringe, and platelet counts were measured in EDTA-treated samples. Potassium levels were measured using the same equipment. Conclusions We suggest that plasma potassium is more reliable than serum potassium in the critically ill, since the latter depends on the platelet count. We should evaluate the patient and act according to the plasma levels, since pseudohyperkalemia is more likely, even at normal platelet counts. The higher SPdif observed in the critically ill implies that a number of their platelets are not counted by conventional methods, probably because they are activated and aggregate during the initiation of the acute phase reaction.

Myelinolysis (ML) may be related to an excessively rapid correction of hyponatremia, but no study has clearly demonstrated this assumption. We prospectively analyzed the determinants of ML following correction of severe hyponatremia. All patients admitted to a 10-bed university medical ICU between July 1995 and March 2003 with hyponatremia (< 120 mmol/l) were included. A cerebral computerised tomography (CT) scan was performed within 2 days, and a cerebral magnetic resonance imaging (MRI) scan was systematically planned at 1 month. ML diagnosis was ascertained when hyperintense lesions were present on the T2 sequence in any area considered normal on the initial imaging. Clinical and biological data were assessed every 6 hours until correction of natremia. Therapeutic options were left to the discretion of the intensivists.
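A minimal sketch of the serum/plasma potassium difference (SPdif) analysis described above; the data arrays are invented for illustration, and only the calculation and the regression of SPdif on platelet count mirror the abstract.

    from scipy.stats import linregress

    serum_k = [4.9, 5.2, 4.6, 5.5, 4.8]    # mmol/l, hypothetical
    plasma_k = [4.4, 4.6, 4.2, 4.8, 4.4]   # mmol/l, hypothetical
    platelets = [250, 410, 180, 520, 300]  # x10^9/l, hypothetical

    spdif = [s - p for s, p in zip(serum_k, plasma_k)]  # SPdif = SK - PK

    # Does SPdif rise with platelet count, as the abstract argues?
    fit = linregress(platelets, spdif)
    print(f"slope = {fit.slope:.5f} mmol/l per 10^9/l, r = {fit.rvalue:.2f}")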
Objectives To test the hypotheses: (1) that chloride accounts for an increasing proportion of the base deficit with treatment in diabetic ketoacidosis (DKA), and (2) that the perceived discrepancy between changes in the anion gap and bicarbonate/base deficit during treatment is primarily due to the chloride effect. Methods A retrospective cohort study of children < 16 years (n = 18) admitted for acute management of DKA to two paediatric ICUs. Stewart's physicochemical theory was used to calculate the independent effect of chloride on the bicarbonate and base deficit via a linear regression model. This model was then used to: (1) quantify the effect of chloride on the base deficit, and (2) evaluate the relationship between change in the anion gap and both bicarbonate and base deficit before and after correction for chloride. Results Eighteen children (median age 12.7 years, weight 43 kg) were followed for 20 hours after initiation of therapy (insulin and fluid resuscitation). There was a steady improvement in pH over this time (mean pH 6.97-7.31). However, at 20 hours a significant base deficit persisted (mean 10.1 mmol/l), despite the anion gap having normalised (mean 33.7 to 16.4 mmol/l). The base deficit at this time was almost exclusively (98%) due to hyperchloraemia. The relationship between changes in the anion gap and both bicarbonate and base deficit improved dramatically, approaching one-for-one after correction for chloride (slope 0.99, r² = 0.96 and slope 1.14, r² = 0.95, respectively). Conclusion Chloride has a confounding effect on the interpretation of the base deficit and bicarbonate during the treatment of DKA. This does not occur with the anion gap, which may be a better marker to track resolution of ketoacidosis and should therefore be incorporated into treatment guidelines.

Setting A 10-bed specialized ICU in the Belorussian Center for Paediatric Oncology. Objective To estimate the influence of nutritive status on the severity of septic complications and mortality in children with neutropenia. We prospectively followed 104 cases of sepsis (33 cases of sepsis, 36 cases of severe sepsis and 35 cases of septic shock, according to ACCP/SCCM criteria) among neutropenic children aged 2 to 16 years with myeloproliferative diseases and solid tumors. The mean PRISM II score at the time of ICU admission was 9.17 ± 5.9 for patients with sepsis, 19.46 ± 5.92 for severe sepsis, and 28 ± 7.29 for septic shock. We compared the percentage of weight loss during the period of chemotherapy before sepsis onset among survivors and nonsurvivors. The control group included 25 children with leukemia and lymphoma, matched for age, gender and duration of treatment, and without septic complications. For the statistical analysis a t test was used. Results A decrease of body mass between 5.2% and 28% (mean 11.7 ± 6.45% weight loss during the previous 3.5 months) was found in 75.8% of patients with sepsis, 77.7% with severe sepsis and 87.5% with septic shock. The differences in nutritional condition among survivors and nonsurvivors with weight loss and sepsis are summarized in Table 1. In the control group weight loss was found in 37.1% of patients, and the mean level of weight loss was 4.05 ± 1.48%, which was significantly (P < 0.05) lower than in patients with sepsis. There was no mortality in the control group. The mortality rate was 21.2% in patients with sepsis, 61% in severe sepsis, and 85.7% in septic shock.
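For the DKA abstract above, the chloride contribution to the base deficit can be illustrated with the simplified Fencl-Stewart partition of base excess. Note that this published simplification is an assumption used here for illustration only; the authors derived the chloride effect from their own linear regression model.

    def sodium_chloride_effect_meq_l(na_mmol_l, cl_mmol_l):
        # Contribution of the Na-Cl difference to base excess (mEq/l)
        return (na_mmol_l - cl_mmol_l) - 38.0

    def albumin_effect_meq_l(albumin_g_l):
        # Contribution of hypoalbuminaemia to base excess (mEq/l)
        return 0.25 * (42.0 - albumin_g_l)

    # Hyperchloraemic example: a normal anion gap with a persistent base deficit
    print(sodium_chloride_effect_meq_l(140.0, 112.0))  # -10.0 mEq/l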
Significant differences (P < 0.05) in weight loss between survivors and nonsurvivors were found in all groups of patients. Weight loss during chemotherapy for cancer is a risk factor associated with increased morbidity and mortality.

Objective To determine the changes in resting energy expenditure (REE), body temperature and jugular bulb oxygen saturation (SjVO2) of patients with brain injury or brain death. Methods Fifty-two patients with Glasgow Coma Scale score < 6 admitted to our intensive care unit between October 2002 and November 2003 were included in the study. Group 1 (n = 26) included patients with brain death, and was later divided into two subgroups: Group 1a (n = 13) consisted of patients who were already brain dead when included in the study, and Group 1b (n = 13) of patients who were not initially brain dead but progressed to brain death during their intensive care unit stay. Group 2 (n = 26) consisted of patients with brain injury but no brain death. REE measured by indirect calorimetry, SjVO2 and body temperature were recorded daily and simultaneously during the first 5 days of the study. REE values were expressed as the percentage of basal metabolic rate (BMR%) calculated using the Harris-Benedict equation. There were no differences in age, APACHE II score at admission, reason for coma or BMR between Group 1 and Group 2 (P > 0.05). Mean body temperatures were 35.6 ± 0.9°C and 37 ± 0.6°C (P < 0.01), mean SjVO2 values were 90.3 ± 9.9% and 77.9 ± 10% (P < 0.01) and mean REE values were 1542 ± 580 kcal (97 ± 26.8% of mean BMR) and 1963 ± 600 kcal (117 ± 29.2% of mean BMR) (P < 0.05) in Group 1 and Group 2, respectively. In Group 1b, the mean body temperature was lower and the mean SjVO2 higher than the values before brain death (P < 0.05). In this group, although the mean REE was lower than the value before brain death, the difference was not statistically significant (P = 0.07). In this study, we found that the mean REE was 17% higher than the BMR in patients with brain injury. The mean REE and body temperature were lower, and the mean SjVO2 higher, in brain-dead patients than in patients without brain death.

Reduced translocation of potentially pathogenic bacteria in the gut and changes in the immune system may be responsible for this effect. The aim of this randomised double-blind trial was to investigate changes in cytokine concentrations in patients receiving EEN enriched with either LAB or placebo. Patients and methods Thirty-three patients undergoing either pylorus-preserving pancreaticoduodenectomy or Whipple's operation were enrolled. EEN enriched with either LAB (Synbiotic 2000™) (n = 17) or placebo (n = 16) was supplied after double-blind randomisation for a period of 5 days beginning the day before the operation. Samples were taken before surgery as well as postoperatively on days 1, 4 and 8. The plasma samples were centrifuged and quickly frozen to -70°C. The concentration of cytokines was measured by a specific ELISA technique (OptEIA Human set, Pharmingen, USA) according to the standard procedure. Results IL-6 increased significantly in both groups until day 1; thereafter a decrease was seen. In contrast, IL-12 decreased after the operation and increased until day 8 to nearly the same level as at the onset of the investigation (placebo group 36.27 ± 48.67 pg/ml; verum group 56.58 ± 69.43 pg/ml).
Simultaneously, the amount of interferon gamma (IFNγ) decreased postoperatively in the verum group and increased in the placebo group. However, all differences seen were only tendencies; no significance could be shown. The concentration of IL-10 increased after the operation, with a higher increase from the day before surgery until day 4 in the verum group than in the placebo group. Summary and conclusion Enrichment of EEN with LAB seems to have no significant influence on the known postoperative increase of IL-6 or the decrease of IL-12 concentration. In contrast, the LAB supply seems to influence the amounts of IL-10 and IFNγ. Further investigations are necessary to elucidate the underlying mechanisms.

In modern intensive care medicine there is a drive toward enteral nutrition (EN), since it has been shown to protect gut mucosal function, to reduce infective morbidity, to hasten recovery from illness and to contribute to a lower mortality rate. EN requires a functioning, intact gastrointestinal tract, may cause diarrhoea and carries an attendant risk of pulmonary aspiration. Further, data indicate that discrepancies between prescription and delivery of EN carry a risk of undernutrition. We therefore designed this study with the aim of identifying discrepancies between prescribed and delivered nutrition, and of evaluating benefits and problems associated with EN and parenteral nutrition (PN). We also compared the actual amounts of fat, glucose and nitrogen delivered with calculated requirements. Methods Data on nutritional supply (energy intake; amounts of glucose, fat and nitrogen) and gut function (diarrhoea, gastric retention and vomiting) were registered daily for all patients admitted to the ICU during 6 weeks. Patients treated for less than 3 days were excluded. Energy requirements were calculated according to the Harris-Benedict equation and the nitrogen (N) demand was set at 0.15 g N/kg. Data are based on 267 treatment days, and are presented as mean ± SEM. Results Twenty-six out of 74 patients stayed in the ICU for > 3 days and were thus included in the study. On average, patients received adequate amounts of energy (around 1500 kcal/day) from the third day, and throughout the study period. Of the energy delivered, fat accounted for 20-30%. The daily N supply was approximately 14 g. There was a good correlation between prescribed and administered amounts of enteral nutrition (R² = 0.90). Of the total amount of energy delivered, 53% was administered enterally. Bowel function was considered normal for 151/241 days (63%). Daily registrations of gut function demonstrated diarrhoea (11%), gastric retention (29%) and vomiting (2.5%). Conclusions EN was gradually increased, according to our nutrition guidelines, and predominated after the first 7 days. The energy input was usually adequate within 3 days after admission. EN was generally well tolerated with few complications. Diarrhoea was most frequent in the enterally fed patients, and gastric retention was common among patients given PN. The correlation between prescribed and received amounts of enteral feed was high.

Background The aim of this study was to develop a standardized procedure for assessing patients' nutritional needs and providing individualized enteral nutrition (EN) regimens by using a software tool that can accomplish this task in a fast, safe, yet simple way. The study took place in the Surgical Intensive Care Unit of the University Surgical Department of Hippocration Hospital.
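Both the REE study and the nutrition survey above rely on the Harris-Benedict equation. A minimal sketch using the classic coefficients, together with the survey's nitrogen target of 0.15 g N/kg/day; the worked example values are invented.

    def harris_benedict_kcal(weight_kg, height_cm, age_yr, male):
        # Classic Harris-Benedict basal metabolic rate (kcal/day)
        if male:
            return 66.47 + 13.75 * weight_kg + 5.0 * height_cm - 6.76 * age_yr
        return 655.1 + 9.56 * weight_kg + 1.85 * height_cm - 4.68 * age_yr

    def nitrogen_demand_g(weight_kg):
        return 0.15 * weight_kg  # g N/day, per the study protocol

    bmr = harris_benedict_kcal(70, 175, 60, male=True)
    ree = 1750.0                   # measured by indirect calorimetry, hypothetical
    print(round(bmr))              # ~1498 kcal/day
    print(round(100 * ree / bmr))  # BMR% as used in the REE study, ~117%
    print(nitrogen_demand_g(70))   # 10.5 g N/day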
The procedure involves input of patient data such as age, weight and sex, along with data relevant to the clinical status of the patient, as well as treatment details. Specialized software can then analyze these data, determine nutritional parameters by implementing evidence-based equations, and accurately calculate an individual patient's daily nutritional needs in total energy, protein, carbohydrates, fat, vitamins and minerals, as well as the concentrations of each nutrient and the total EN volume to be administered on a 24-hour basis. The software was designed to monitor the administered regimen and to alarm in case of any mismatched calculation (according to the underlying disease, the clinical status and the nutritional parameters of the patient), and to inform the user about any metabolic complication that is likely to occur (according to the patient's laboratory tests). For the purpose of this study, computer-calculated EN regimens were administered to 30 randomly selected patients in our unit and were then compared with the regimens that would have been administered without the help of the computer. There were large differences in the EN calculated by the computer, and on a number of occasions the computer indicated the use of a different preparation in order to better suit the patient's nutritional needs. There was a reduction of the time consumed for the calculations (65%) and a decrease in false calculations (20%), whereas the early recognition of metabolic complications increased to 40%. The utilization of specialized software seemed able to help health professionals select an optimal EN regimen and estimate the appropriate fluid volume according to patient needs. Conclusion Implementation of this software enables health professionals to overcome the burden of calculations, while it can also accomplish labeling, statistical analysis, and record management, thus allowing the provision of individualized EN to become an efficient standard routine procedure. It promotes a simple, fast, and safe way of providing individualized nutritional support, and the quality of nutritional services would certainly benefit from its routine clinical application.

It is difficult to achieve transpyloric placement of an enteral feeding tube in infants. We have devised a new method of placing an enteral feeding tube for postpyloric feeding in infants. The patient is intubated. A 5.5 mm ID nasoenteric feeding tube, with a scale drawn on its surface and an inner wire stylet, is inserted and advanced through the esophagus into the gastric lumen. Then a specially made 3.0-4.0 mm ID intratracheal tube with a cuff is connected to a Bodai connector, used for the fiberscope, whose proximal end is sealed except for a small hole to insufflate oxygen. After this assembled tube is placed into the esophagus transorally, a 3.0 mm OD fiberscope is inserted into it and the distal end is led into the stomach. The cuff is inflated to fix the position and prevent leakage of the air insufflated into the gastric lumen. After inflation of the gastric cavity with air, the distal orifice of the enteral feeding tube is advanced distally into the duodenum under direct fiberscopic view. The distance from the antrum can be determined by the scale on the tube. The method was tried in four patients; all attempts succeeded without complications, and continuous postpyloric enteral feeding was safely begun immediately.
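The core arithmetic a tool like the one described in the software abstract above automates is simple; a minimal sketch in which the energy density of the preparation is a hypothetical example value, not a product specification.

    def en_volume_ml(energy_need_kcal, formula_kcal_per_ml):
        # Daily energy need divided by formula energy density gives the 24-hour volume
        return energy_need_kcal / formula_kcal_per_ml

    print(round(en_volume_ml(1800, 1.5)))  # 1200 ml over 24 hours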
Aim and methods The present study was designed to investigate the effect of enteral nutrition on postoperative sepsis-induced NO production, as an index of oxidative stress. Five groups of 10 male Wistar rats were subjected to midline laparotomy and feeding gastrostomy. Ten rats were allowed to recover from the operative stress for 10 days and served as controls. The remaining 40 rats were allocated to receive through the gastrostomy either enteral feeding (Fresubin-HP Energy, Fresenius-Kabi, 2 ml/hour, 75 kcal/day) or water for 24 hours, after which they were subdivided into two further groups by intraperitoneal injection of 10 mg/kg Escherichia coli lipopolysaccharide (LPS) (Difco 0111:B4; Sigma Chemicals) or placebo. Two hours later all rats were sacrificed, having first been subjected to blood and liver tissue sampling. NO production was quantified by measuring the total nitrite plus nitrate concentration in serum samples and in liver tissue homogenates, by means of a spectrophotometric method that uses a modification of the Griess reaction. NO synthase mRNA expression was examined in liver tissue homogenate in RNAzol by RT-PCR. Results A basal production of NO was found in the serum of control rats. The operation itself was found to induce a significant increase (P < 0.01) in serum NO levels, and the injection of LPS increased NO levels further (P < 0.001). Enteral feeding was found to significantly decrease (P < 0.01) NO levels in both groups. In contrast, NO in liver homogenates was significantly increased (P < 0.05) in enteral nutrition plus LPS-treated rats compared with placebo feeding plus LPS. Interestingly, LPS was found to induce inducible NO synthase (iNOS) mRNA expression in liver tissue regardless of the enteral feeding, while liver from placebo (no LPS) treated animals did not express iNOS mRNA. These findings indicate that early enteral feeding leads to a reduction of the circulating NO levels induced by operation and sepsis, but increases hepatic NO levels, probably via the effect of the LPS-induced iNOS on the increased L-arginine uptake.

and reduced susceptibility to peroxidation reactions compared with LCT-containing fat emulsions, which possess strong proinflammatory and oxidative damage potential [1]. The aim of this study was to compare the pulmonary metabolic effects of these two fat emulsions. We prospectively measured intrapulmonary oxygen consumption (VO2ipulm), lung lactate production (LLP), glucose flux through the lung (GF), energy expenditure (EE) and respiratory quotient (RQ) in 26 adult post-traumatic ARDS patients receiving conventional fat emulsion. At study entry patients were randomized: those in the LCT group (n = 13) continued to receive conventional 20% LCT-containing fat emulsion, while in the MCT/LCT group (n = 13) a 20% MCT-containing fat emulsion was started. In both groups fat emulsion was infused at a dose of 1 g/kg/day. VO2ipulm was estimated by subtracting the VO2 calculated by the reverse Fick method from the whole-body VO2 measured by indirect calorimetry (Datex Ohmeda M-COVX metabolic monitor). All measurements and calculations were performed simultaneously after achieving steady-state conditions, in 2-hour periods over 24 hours. The overall mortality in this setting was 64%. We found substantial increases in VO2ipulm as a component of the whole-body VO2 in both groups (median 29.3%; interquartile range 26.2-34.1%).
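A minimal sketch of the VO2ipulm estimate described above: whole-body VO2 from indirect calorimetry minus VO2 calculated by the reverse Fick method, using the standard oxygen-content formula. All input values are invented.

    def o2_content_ml_dl(hb_g_dl, sat_fraction, po2_mmhg):
        # Bound plus dissolved oxygen content of blood
        return 1.34 * hb_g_dl * sat_fraction + 0.003 * po2_mmhg

    def fick_vo2_ml_min(cardiac_output_l_min, cao2_ml_dl, cvo2_ml_dl):
        # Reverse Fick: VO2 = cardiac output x arteriovenous O2 content difference
        return cardiac_output_l_min * (cao2_ml_dl - cvo2_ml_dl) * 10.0

    cao2 = o2_content_ml_dl(12.0, 0.98, 90.0)  # arterial
    cvo2 = o2_content_ml_dl(12.0, 0.70, 40.0)  # mixed venous
    vo2_fick = fick_vo2_ml_min(6.0, cao2, cvo2)
    vo2_calorimetry = 360.0                    # ml/min, hypothetical
    print(f"VO2ipulm = {vo2_calorimetry - vo2_fick:.0f} ml/min")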
In the MCT/LCT group there was an abrupt (mean value 26 ± 11 min) and sustained significant increase in EE (mean 3.4 ± 0.3 kcal/kg), with a maximal corresponding increase in whole-body VO2 at the 6th hour. VO2ipulm showed a constant decline, reaching minimum median values of 20.4% (interquartile range 17.1-22.4%) of the whole-body VO2 at the 8th hour. The absolute value of VO2ipulm also decreased, from 2.1 ± 0.4 ml/kg/min to 1.3 ± 3 ml/kg/min. These changes were not accompanied by significant variations in LLP, GF or RQ. We conclude that the observed decrease in VO2ipulm in the MCT/LCT group, with concomitant increases in whole-body VO2 and EE and without any significant changes in RQ, LLP or GF, is indicative of increased oxygen consumption for metabolic purposes outside the lung. These data suggest that infusion of an MCT-containing fat emulsion could lead to a significant decrease in the oxygen used by the lung for nonmetabolic purposes (i.e. free radical formation from lipid peroxidation). The observed positive short-term metabolic effects of MCT/LCT-containing fat emulsions could have potential clinical applications in reducing the oxidative damage caused by the large inflammatory mass of the lung in ARDS patients.

Recognizing the ischemic effects of sympathomimetic toxins on various vascular beds, we hypothesized that renal insufficiency in the setting of cocaine- and amphetamine-related rhabdomyolysis is associated with direct ischemic injury to renal tubules that is independent of the extent of muscle damage and volume depletion. Methods This is a retrospective study of consecutive patients with a diagnosis of rhabdomyolysis seen in our emergency department over 44 months. Data included blood urea nitrogen (BUN), white blood cell count, urine toxicology screen and hematocrit. The change in a second posthydration hematocrit was used to approximate the degree of initial volume depletion. Creatine kinase (CK) and creatinine levels were followed serially. Patients with obvious renal failure on presentation, defined as a first creatinine of 4 mg/dl (354 µmol/l) or more, were excluded. Statistical analysis utilized the two-tailed Student t test for continuous variables and the Fisher exact test for categorical variables. Results See Table 1. Two groups were identified: 79 patients who tested positive for cocaine or amphetamine and 52 who tested negative. Nine of 79 patients in the 'tox' group and 2/52 patients in the 'no tox' group had obvious renal failure and were excluded (P = 0.12). Conclusion Despite lower CK values suggesting a smaller amount of myoglobin delivery to the kidneys, and despite the absence of any demonstrable difference in markers of volume depletion such as BUN and percentage of dehydration, patients in the 'tox' group still had higher admission creatinine levels. Our findings suggest that vasoactive sympathomimetic drugs of abuse have an ischemic effect on the kidneys that is independent of the effects of myoglobin deposition and volume depletion.

An appropriate diagnostic approach is crucial to assess carbon monoxide (CO) cardiac damage in CO poisoning [1]. QT dispersion (QTd) is a measure of inhomogeneous repolarization and is used as an indicator of arrhythmogenicity. Objectives The aim of the present prospective study was to evaluate the relationship between patient age and QT dispersion of the surface ECG in carbon monoxide poisoning. Methods Carbon monoxide intoxication was confirmed in 40 patients by arterial blood gas analysis.
Patients were subdivided into two groups according to age: < 35 years (Group I) or > 35 years (Group II). QT dispersion was measured from the surface electrocardiogram, and Bazett's formula was used to correct the QTd for heart rate (QTcd). QT intervals were measured on the admission ECG and on ECGs 24 and 72 hours after admission. There were no significant differences between Group I and Group II with regard to gender and carboxyhemoglobin levels. On admission, the QTd, QTc and QTcd intervals in Group I were significantly increased compared with Group II, but not the QT interval. There were no significant differences in QT interval measurements between Group I and Group II 72 hours after admission. Conclusion Although QT dispersion increased in patients with CO poisoning, age-related increases in QTd in the absence of QT interval prolongation may mark this group as high risk.

Complications after AAOD were: vomiting (more than two times per day) in 10 patients (15%) and diarrhoea (more than three times per day) in three patients (4%). No aspiration pneumonia or renal failure was seen. One patient was admitted to a hospital for 2 days to prevent dehydration. (3) The results for abstinence, employment, training and education, and criminal behaviour are shown in Table 1. Conclusion (1) Although the majority of patients in this group made use of methadone programmes, they still often suffered from health problems common to intravenous drug users, such as being underweight, respiratory problems and hepatitis B and hepatitis C. No HIV-positive patients were found. (2) Complications after AAOD were infrequent and not severe. (3) AAOD and naltrexone combined with CBT, as provided by our clinic, leads to opiate abstinence in 70% of patients after 1 year and to a significantly higher participation in employment, training and education, and a significant decrease in criminal activities.

The ingestion of a caustic substance can produce severe injury not only to the esophagus but also to the rest of the gastrointestinal tract, and can even result in death. The degree and extent of damage depend on several factors, such as the type of substance, the quantity and the intent; the quantity has prognostic value: 20-50 ml = severe, > 50 ml = very severe. In the acute phase, perforation and necrosis may occur, often with devastating consequences for the esophagus and the stomach; the injuries are associated with high mortality and morbidity rates when mediastinitis, gastrobronchial fistula, chemical peritonitis or perforation of the gastrointestinal tract occurs. Perforation may also occur in the late phase, especially with the ingestion of alkaline solutions, because the inflammation invades more deeply with the continuous release of OH– after coming into contact with protein. Long-term complications include stricture formation in the esophagus, antral stenosis and the development of esophageal carcinoma. The aim of this study was to evaluate whether early diagnosis and surgical treatment are essential to improve the prognosis. Methods From November 2000 to November 2003, six patients were admitted to our department (mean age 38 ± 10 years, 1:1 male to female ratio). The average time between the caustic ingestion and admission to the emergency ward was 5 ± 2 hours. The ingested substances were alkali in 80.9% and acid in 19.1% of the cases. Blood gas examination, endoscopy, computed tomography, and chest and abdominal X-rays were performed.
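A minimal sketch of the two ECG measures used in the CO-poisoning study above: Bazett's rate correction (QTc = QT / sqrt(RR)) and QT dispersion (maximum minus minimum QT across the 12 leads). Input values are invented.

    from math import sqrt

    def qtc_bazett_ms(qt_ms, rr_interval_s):
        return qt_ms / sqrt(rr_interval_s)

    def qt_dispersion_ms(qt_by_lead_ms):
        return max(qt_by_lead_ms) - min(qt_by_lead_ms)

    leads = [380, 396, 372, 401, 388, 410, 378, 392, 385, 399, 404, 376]
    print(round(qtc_bazett_ms(400, 0.8)))  # ~447 ms at 75 beats/min
    print(qt_dispersion_ms(leads))         # 38 ms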
They required orotracheal intubation and ventilatory support; antibiotic therapy with metronidazole and piperacillin-tazobactam was initiated early, and optimal hemodynamic stabilization was achieved. A combined surgical procedure was performed: in the acute phase, laparoscopic esophageal-total gastrectomy, open cervicotomy, cervicostomy and percutaneous enterostomy for enteral nutrition, while the gastric-esophageal reconstruction was performed subsequently. Blood gas examination showed a mean pH of 7.22 ± 0.14 and a mean base excess of -10.0 ± 6.5. Endoscopy revealed multiple deep brownish-black ulcers (four patients) and perforation (two patients). Chest X-ray revealed air bubbles close to the inferior third of the oesophagus, while abdominal X-ray revealed no perforation. The ICU stay was 20 ± 7 days. There were four postoperative complications: breakdown of the enterostomy after the first surgical procedure (n = 1) and, in the reconstruction phase, anastomotic pseudo-diverticula (n = 1) and anastomotic leakage (n = 2, both of whom died). The surgical mortality was 33%. All surviving patients tolerated oral intake well after surgery; a high-protein, hypercaloric diet seemed to be beneficial. Early laparoscopic surgical treatment improved the prognosis in these severe cases.

Methods This study evaluated the predefined subgroup of 226 patients with a history of pulmonary disease included in a prospective randomized controlled trial of BNP testing for the emergency diagnosis of acute dyspnea. Patients were randomly assigned to a diagnostic strategy with (n = 119, BNP group) or without (n = 107, clinical group) the use of BNP levels provided by a rapid bedside assay. The time to discharge and the total cost of treatment were recorded as the primary endpoints. Results Baseline demographic and clinical characteristics were well matched between groups. Comorbidity was extensive, including coronary artery disease and hypertension in one-half of the patients. The primary discharge diagnosis was CHF in 39% and exacerbated obstructive pulmonary disease in 33%. The use of BNP levels significantly reduced the need for hospital admission (81% vs 91%, P = 0.034). The median time to discharge was 9.0 days in the BNP group as compared with 12.0 days (P = 0.001) in the clinical group. The total cost of treatment was $5764 (95% confidence interval, 4450-7078) in the BNP group as compared with $7665 (95% confidence interval, 6448-8882; P = 0.038) in the clinical group (Fig. 1). In-hospital mortality was 8% in both groups. Conclusion Used in conjunction with other clinical information, rapid measurement of BNP reduced the time to discharge and the total treatment cost of patients with a history of pulmonary disease presenting with acute dyspnea.

Introduction Prehospital rapid sequence intubation (RSI) is an intervention utilized for airway management in patients who are not in cardiac arrest. We report patient outcomes following prehospital RSI utilized for medical emergencies. Hypothesis Airway management with prehospital RSI for medical emergencies improves patient outcomes. Methods From 1990 to 1999, all 9-1-1 incidents related to childhood (ages 0-14 years) drowning events were examined prospectively in a metropolitan US sunbelt city (population 2 million) using a comprehensive Utstein-style database. The at-risk population (ages 0-14 years) averaged 418,000 during the study.
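The incidence figures quoted in the drowning study follow from simple arithmetic; a minimal sketch reproducing the 10.0 per 100,000 per year rate from 42 cases/year in an at-risk population of about 418,000.

    def annual_incidence_per_100k(cases_per_year, population):
        return cases_per_year / population * 100_000

    print(round(annual_incidence_per_100k(42, 418_000), 1))  # 10.0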
In the finite population studied, two-thirds of all (adult and child) submersion incidents involved children, totaling 420 cases (mean = 42 children/year; annual incidence = 10.0 per 100,000), with 72% (n = 303) occurring in those aged 5 years or younger (20.2 per 100,000/year). In certain years, this younger cohort comprised as many as 87% of cases. Most cases (65%) occurred in summer and 83% between 12:00 and 8:00 pm (none 12:00-7:00 am). The site was a pool in 75% of cases (n = 317), with 64% of these at apartments. Only 19% involved tubs/spas (annual range = 6-34%) and 5% were in buckets, toilets, bayous, lakes, and creeks. Of the 420 total cases, one child was found dead on-scene and 234 clearly required resuscitative efforts (using strict criteria). Bystanders performed CPR in 82% of these resuscitation cases (n = 193) and 72% of these children survived long-term (99% neurologically intact). However, if a child remained apneic/pulseless by the time emergency services arrived (average response = 5 min), less than 5% were revived (none neurologically intact). Of 94 total deaths, two-thirds occurred in pools. In certain venues, submersion incidents can account for a large number of per-capita childhood deaths, and immediate basic bystander CPR, not advanced life support, is the most definitive resuscitative action for children with drowning-related incidents. Considering that most drowning incidents in this study occurred in residential pools and in those aged 5 years or younger, supervision, safety barriers and knowledge/performance of bystander CPR appear to be the major factors in prevention of childhood drowning deaths.

Aim To compare the incidence of pre-arrest signs in patients suffering unexpected cardiac arrest in different types of hospitals. We reviewed the records of patients who suffered unexpected cardiac arrest during an 18-month period in a tertiary teaching hospital, in a tertiary trauma hospital and in two secondary hospitals. Data on patient characteristics, and on observations and interventions during the 8 hours preceding cardiac arrest, were collected. The findings were evaluated against the calling criteria of the Medical Emergency Team (MET) [1]. In the four hospitals, 110 patient records were reviewed. Fifty-six of the cardiac arrests occurred on a ward, and 25 (45%) of these patients fulfilled MET calling criteria. The mean time from the first documented abnormal vital sign to the arrest was 3.8 hours (range 0.25-8.00 hours). The proportion of patients meeting MET criteria differed significantly between the tertiary teaching hospital, the two secondary hospitals and the trauma hospital (14%, 27%, 69% and 80%, respectively; chi-square P < 0.001). The most frequent criteria were respiratory distress, SpO2 < 90% on oxygen and systolic blood pressure < 90 mmHg. Of the patients suffering cardiac arrest elsewhere than on a ward (i.e. in a coronary care unit), 22% fulfilled MET criteria, but these patients received intensive treatment immediately. The incidence of pre-arrest signs, and thus the potential benefit of a MET called when certain criteria are fulfilled, appears to vary considerably between hospitals of different types.

In cardiopulmonary resuscitation, securing the airway is of paramount importance. Even if intubating the trachea can still be seen as the gold standard, it remains reserved for experts and healthcare professionals.
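A minimal sketch of checking a set of observations against MET calling criteria. Only the three most frequent criteria named in the abstract above are encoded; the full MET criteria set is broader, and the function shape is hypothetical.

    def meets_met_criteria(respiratory_distress, spo2_percent, on_oxygen, sbp_mmhg):
        if respiratory_distress:
            return True
        if on_oxygen and spo2_percent < 90.0:  # SpO2 < 90% on oxygen
            return True
        return sbp_mmhg < 90.0                 # systolic blood pressure < 90 mmHg

    print(meets_met_criteria(False, spo2_percent=88.0, on_oxygen=True, sbp_mmhg=110.0))  # True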
However, the insertion of a laryngeal airway device offers, compared with bag-valve face-mask ventilation, the opportunity to ventilate a patient effectively. Moreover, it can also be placed easily by lay responders. We put forward the hypothesis that a laryngeal airway device can be placed intuitively without any background knowledge about the device, and that a simple but well-directed training programme can further improve performance. The aim of the study was to investigate the intuitive use of different airway devices by first-year medical students. Methods The devices tested were the LMA-Classic and the LMA-FastTrach. The subjects were 139 medical students. They were evaluated on an airway trainer, mainly regarding the time to correct placement of the device, the number of attempts and the initial tidal volume. The trachea of the mannequin was therefore connected to a volumeter. An initial tidal volume < 150 ml was considered insufficient and excluded from further data analysis. A second evaluation was done after a specific training programme. Twenty of 79 subjects in the LMA-Classic group and 11 of 60 subjects in the LMA-FastTrach group had an initial tidal volume < 150 ml. The measured tidal volume was 673.7 ± 133.1 ml for the LMA-Classic and 1057.7 ± 158.5 ml for the LMA-FastTrach. The mean time to correct placement was 55.5 ± 29.6 s for the LMA-Classic and 38.1 ± 24.9 s for the LMA-FastTrach. In the second evaluation, an initial tidal volume < 150 ml was recorded in 14 of 79 subjects for the LMA-Classic and in six of 60 subjects for the LMA-FastTrach. The time to correct placement decreased significantly, to 22.9 ± 13.5 s for the LMA-Classic and 22.9 ± 19.0 s for the LMA-FastTrach. The measured tidal volume was 777.6 ± 367.9 ml for the LMA-Classic versus 1018.4 ± 50.7 ml for the LMA-FastTrach.

NIF is a class III antiarrhythmic agent synthesized in Japan as a pure K channel blocker without a negative inotropic effect [1]. In Japan, NIF has been the antiarrhythmic drug most often recommended for patients with ventricular fibrillation (VF) resistant to other antiarrhythmic drugs. However, whether NIF improves the rate of successful resuscitation after out-of-hospital cardiac arrest with shock-resistant VF has not been determined. Our preliminary study found a significant improvement in the proportion of patients surviving to the emergency department following out-of-hospital cardiac arrest in NIF-treated patients [2]. Our protocol is as follows: for out-of-hospital cardiac arrest patients with shock-resistant VF or pulseless VT (after three or more precordial shocks), epinephrine (1 mg bolus) and then NIF (0.3 mg/kg bolus followed by 0.4 mg/kg/hour infusion) are administered. We showed that 31% of out-of-hospital cardiac arrest patients with shock-resistant VF survived with this protocol [2]. We will present the details of the clinical course of successfully resuscitated patients treated with this protocol.

Results AEDs were used for 18 patients. The administration of a shock was advised in the nine patients who had electrocardiographically documented ventricular fibrillation, and no shock was advised in the remaining patients (sensitivity and specificity of the defibrillator in identifying ventricular fibrillation, 100%); in 50% of these the rhythm was pulseless electrical activity (probably the 'silent dead'). The first shock successfully defibrillated the heart in nine patients (100%).
Expanding knowledge of AEDs and correct implementation of the chain of survival 'on board' will increase the extent of recovery of cardiopulmonary arrest victims in aircraft.

Intervention A random telephone survey gathered demographic information about 408 respondents, who rated their level of agreement with questions concerning witnessed CPR on a five-point scale. The respondents ranked their level of agreement in the manner presented in Table 1. Age, level of education, income level, perceived health status, and end-of-life planning did not correlate with the responses. Married and widowed respondents, in contradistinction to others, believe that witnessed CPR would benefit the patient (P = 0.023). Respondents desiring CPR were more prone to believe that significant others should be allowed during CPR, as opposed to those not desiring CPR (P = 0.005). They are more apt to want others present with them while undergoing CPR than those declining CPR (P = 0.002). They also felt more strongly that the presence of significant others during CPR would benefit the patient (P = 0.018) and family or friends (P = 0.022). The desire to be present in the room with a loved one during CPR did not differ significantly (P = 0.078) between the two groups.

Table 1
"I believe family members or friends have the right to be present in the room while a loved one is undergoing CPR": 36.5% (n = 149) strongly agree, 10.3% (n = 42) agree, 22.4% (n = 90) neither agree nor disagree, 9.1% (n = 37) disagree and 20.6% (n = 84) strongly disagree; five respondents answered "I don't know".
"I would want to be in the room with a loved one during CPR": 37.3% (n = 152) strongly agree, 12.0% (n = 49) agree, 12.3% (n = 50) neither agree nor disagree, 8.8% (n = 36) disagree, 28.2% (n = 115) strongly disagree and 1.2% (n = 5) don't know.
"I would want family/friends with me if I were undergoing CPR": 29.9% (n = 122) strongly agree, 12.3% (n = 50) agree, 17.2% (n = 70) neither agree nor disagree, 8.6% (n = 35) disagree, 29.4% (n = 120) strongly disagree and 2.5% (n = 10) don't know.
"The presence of family/friends during CPR would benefit the patient": 24.3% (n = 99) strongly agree, 14.2% (n = 58) agree, 19.9% (n = 81) neither agree nor disagree, 13.7% (n = 56) disagree, 22.3% (n = 91) strongly disagree and 5.4% (n = 22) don't know.
"The presence of family/friends during CPR would benefit the family/friends": 23.8% (n = 97) strongly agree, 13.5% (n = 55) agree, 23.5% (n = 96) neither agree nor disagree, 12.5% (n = 51) disagree, 24.9% (n = 99) strongly disagree and 2.2% (n = 9) don't know.

Conclusion A large segment of the public desires witnessed CPR, and believes it to be beneficial. Age, level of education, income level, and end-of-life planning do not appear to influence these beliefs. Married and widowed respondents were more apt to consider witnessed resuscitation of benefit to the patient. People desiring CPR are more likely to have positive feelings about witnessed CPR. Those in poor health, not desiring CPR, are more pessimistic about witnessed resuscitation. Although healthcare providers have mixed sentiments, it would be wise to develop protocols to accommodate those who wish to remain together during CPR. Healthier individuals, married people and their families, and those widowed will be more demanding in this matter.
The more infirm, not desirous of CPR, will be less demanding and less inclined to avail themselves of such formal programs.

The study took place in a high-fidelity patient simulator. A scenario of a witnessed cardiac arrest due to ventricular fibrillation, occurring after an uneventful period of 2 min, was used. Twenty-four teams, each consisting of three physicians, were randomly assigned to one of two versions of the scenario: version R (reality) mimics reality in that the arrest occurs in the presence of one physician and the remaining two physicians are summoned to help; in version A (all present) all three physicians were present at the moment of the arrest. The performance of the teams was rated using videotapes recorded during the simulations. The first meaningful measure (FMM) was defined as either precordial thump, ventilation, cardiac massage, or defibrillation. Completion of the initial phase (CIP) was defined as three countershocks, initiation of mask ventilation and cardiac massage, and injection of epinephrine. The two groups differed significantly in the timing of important measures (Table 1). The present study was designed to unmask the additional burden of teambuilding during the very early phase of an emergency situation. Teams that were able to form prior to the cardiac arrest performed significantly better than teams that had to form during the cardiac arrest. Thus, the process of teambuilding is associated with a significant delay in crucial measures in cardiopulmonary resuscitation. Further research is necessary on how to improve teambuilding in the setting of medical emergencies.

Methods A doctor and a nurse who are resuscitation instructors conduct the surprise drills using a computerized simulation mannequin (SIM 4000). The instruction team arrives at the department unannounced, presents a clinical scenario, observes the way in which the department functions during the resuscitation process, and documents its findings. At the end of the exercise, a discussion is held with the department staff concerning the quality of the resuscitation implementation. Protocols for treatment of potentially fatal arrhythmias are also reviewed during this session. After the exercise, the department head receives a written report on the quality of the CPR exercise, with recommendations for improving CPR management. Implementing surprise drills supplements the CPR training through which a unified language has been introduced into the hospital.

Background and goals Mild induced hypothermia (MIH) has become a standard in neuroprotective treatment for anoxic brain injury after cardiac arrest. Different surface cooling protocols have been used successfully [1]. We implemented MIH by use of ice-water-soaked towels over the torso and legs, in combination with sedation/muscle paralysis to avoid shivering, after prehospital cardiac arrest in patients aged 15-80 years with persistent coma and no cardiogenic shock. The target temperature (33 ± 1°C) was maintained for 12-24 hours. We evaluated the feasibility of our protocol. Results are presented in Fig. 1. The mean time to reach the target temperature was 4.5 hours (0-12 hours). MIH was maintained for a median of 13 hours (4-26 hours). The rewarming period to 37°C took a median of 8.75 hours (3-21.5 hours). MIH treatment was followed by fever (> 37.9°C) in 23 patients (88%). Conclusions MIH by use of our external cooling protocol is feasible, simple and inexpensive. However, surface cooling is slow, imprecise and in some patients unsuccessful.
Background and goals Mild induced hypothermia (MIH) improves neurological recovery and survival after cardiac arrest in patients in whom the initial rhythm is ventricular fibrillation (VF) [1]. Other initial rhythms, and cardiac arrest due to noncoronary causes, may also benefit from such treatment [2]. We evaluated 10 patients who received MIH after non-VF (asystole, pulseless electrical activity) out-of-hospital cardiac arrest (NVF-OHCA) and compared them with historic controls (n = 9) with NVF-OHCA who were not treated with MIH. The ICU and hospital lengths of stay, as well as the incidence of bad outcome, were compared. Bad outcome was defined as cerebral performance category (CPC) ≥ 3 or death [3]. Results Data are presented in Table 1. The tendency towards prolonged length of stay despite poor outcome suggests that a liberal MIH inclusion policy may result in an unsatisfactory cost-benefit ratio. Further research and larger patient numbers are needed to verify these results.

Results Neurological complications were seen in 10 out of 31 patients (32.3%). These complications were as follows: new-onset recurrent headache (three patients), generalized seizures (two patients), persistent tremor (one patient), central pontine myelinolysis (one patient), dysarthria (one patient), myopathy (one patient) and mutism (one patient). The seizures encountered in the two patients were thought to be associated with the toxic effect of tacrolimus, an immunosuppressant drug, because the seizures ceased after cessation of the drug. Myopathy presenting as quadriplegia was seen in one patient in the very early postoperative period and was diagnosed by EMG and muscle biopsy. It led to prolonged mechanical ventilation, ICU stay and hospitalization. The patient with central pontine myelinolysis lived in a persistent vegetative state for 2 years. Conclusion Neurological complications were observed in 10 out of 31 patients who underwent liver transplantation. Some of these complications were associated with the use of immunosuppressant drugs. In conclusion, neurological complications are frequently encountered after liver transplantation and are a cause of severe morbidity and prolonged ICU and hospital stay.

Objective We describe the course of the serum S-100B concentration in the acute phase of head injury using the YK-150. Patients and methods S-100B serum levels were determined in 10 patients (eight men, two women; mean ± SD age, 50.1 ± 20.2 years). There were two cases of severe head injury (Glasgow Coma Scale [GCS] < 9). Blood samples were taken on admission and 24 and 48 hours after trauma. Serum S-100B protein concentrations (pg/ml) were measured by ELISA (Yanaihara Industry, Tokyo, Japan). Results Initial serum S-100B concentrations were elevated (minimum, 790 pg/ml; maximum, 7,749,669 pg/ml; mean, 979,666 pg/ml). In all patients, serum S-100B concentrations had decreased from the initial value by the second sampling point, 24 hours after injury (minimum, 10.1 pg/ml; maximum, 16,990 pg/ml; mean, 5994 pg/ml). After 48 hours, only two patients showed an increase of serum S-100B concentrations; one of these showed the highest level of serum S-100B and died on day 28 (Fig. 1). Many studies of S-100B have shown a relation between initial values and poor prognosis. We have also shown patients with slight head injuries who were conscious (GCS > 8) and whose elevated serum S-100B concentrations decreased over the next 24 hours.
We suspect it was only the cerebral cell damage that caused the initial increase of serum S-100B concentrations in these head injuries. If there is no secondary brain damage, serum S-100B concentrations will decrease immediately. The YK-150 (Human S-100B ELISA kit) can measure serum S-100B concentrations in 22 ± 4 hours. If, using the YK-150, we can detect slight variations of early-phase secondary brain damage, we can more accurately predict what changes will take place in the patient; if so, the usefulness of the YK-150 will spread even further.

Background Cerebrospinal fluid concentrations of S-100B protein, an acidic calcium-binding protein found in astrocytes and Schwann cells, increase after central nervous system damage. Serum S-100B protein thus has potential as a biomedical marker of brain cell damage. Several reports show a relation between the severity of head injury and serum S-100B protein (S-100B) levels in trauma patients, but there are few data on S-100B in endogenous cerebral disease. Objective The aim of this study was to evaluate S-100B as a marker in endogenous encephalopathy. Methods Serum S-100B protein concentrations (pg/ml) were measured daily by ELISA until ICU discharge in 19 ICU patients (12 men, seven women; age 9-80 years [mean 57.1 ± 22.8 years]) with endogenous encephalopathy. The APACHE II score and Glasgow Coma Scale (GCS) were used to assess severity; electroencephalography (EEG) and computerised tomography (CT) were also examined. Values are expressed as mean ± SD. The unpaired Student's t test, the Mann-Whitney U test, the Wilcoxon signed-rank test, the Kruskal-Wallis test and Pearson's correlation coefficient were used. P < 0.05 was considered statistically significant. Results There were 10 survivors and nine nonsurvivors, with no significant differences in age, APACHE II score or GCS. There was no significant difference in S-100B levels on admission between survivors and nonsurvivors, but S-100B levels were significantly lower in survivors than in nonsurvivors from day 1 (1129 ± 1780 vs 465,370 ± 780,293, P < 0.05) until ICU discharge (16.5 ± 15.9 vs 231,120 ± 591,110, P < 0.05). In survivors, S-100B levels decreased from day 5 (30.1 ± 18.5) to discharge compared with admission levels (P < 0.05); in nonsurvivors, there were no significant changes in S-100B compared with admission levels. There were no correlations of S-100B levels with APACHE II score (R = 0.3, P > 0.05) or GCS (R = -0.1, P > 0.05), but EEG and CT abnormalities were correlated with S-100B levels. Conclusion Serum S-100B concentrations follow different courses in survivors and nonsurvivors of endogenous encephalopathy. Although similar on admission, serum S-100B protein levels in survivors and nonsurvivors diverged from the first day after admission. In survivors, but not in nonsurvivors, S-100B levels decreased until discharge. There were also significant relationships between the severity of EEG or CT abnormalities and S-100B levels. Serum S-100B protein could be a useful biomedical marker for the assessment of brain damage and may predict prognosis in endogenous encephalopathy.

P308 Severe brain injury epidemiology in Western Macedonia: experience of a general hospital as the basis for planning brain injury management. Results are presented in Tables 1 and 2. There is a need for strong preventive measures to control the high incidence of SBI in the region.
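Given the heavily skewed S-100B values reported above, the nonparametric comparison the authors cite can be sketched as follows; the sample values are invented, and only the choice of the Mann-Whitney U test follows the abstract.

    from scipy.stats import mannwhitneyu

    s100b_survivors = [85, 150, 320, 1129, 2400]             # pg/ml, hypothetical
    s100b_nonsurvivors = [4800, 7700, 46000, 52000, 910000]  # pg/ml, hypothetical

    stat, p = mannwhitneyu(s100b_survivors, s100b_nonsurvivors, alternative="two-sided")
    print(f"U = {stat}, P = {p:.3f}")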
The mortality rates are acceptable when compared with other reports, suggesting adequate quality of initial stabilization and transport, and competency and speed in the detection of candidates for surgical decompression. Nevertheless, this system may lead to some avoidable deaths, especially in acute intracranial hemorrhage. The optimal situation would be the presence of a neurosurgeon in the medical team admitting patients with severe head trauma, and the concentration of injury services, since no single hospital receives sufficient patients to develop and maintain expertise. Severe brain trauma leads to a marked impairment of the patient's autonomy. Public health measures are necessary to organize an early multidisciplinary approach that could improve the patient's rehabilitation and reintegration into the social network.

Fifty-nine patients with severe head injury (Glasgow Coma Scale < 8) were included in our study. Of these 59 patients, 49 were male and 10 female. Their mean age was 36.8 years (range 12-71 years). In all patients, we performed and compared concomitant intracranial pressure (ICP) measurement with an intracranial catheter and a TCD examination. In each TCD examination we measured the maximum, mean and end-diastolic velocities (Vmax, Vmean and Vmin, respectively), and we calculated the pulsatility index (PI). The middle cerebral artery, insonated through the temporal acoustic window, was used for the TCD examinations. A total of 120 TCD examinations and ICP measurements were recorded. We performed correlations of Vmax, Vmean, Vmin and PI with ICP and cerebral perfusion pressure (CPP). There was no statistically significant linear correlation between TCD velocities and ICP, or between the PI and ICP. There was a significant correlation between Vmax and CPP and between Vmean and CPP. The best correlations found were between Vmin and CPP and between the PI and CPP. Linear correlation and regression between TCD findings and ICP and CPP are presented in Table 1. Conclusions TCD examination cannot be used as a reliable noninvasive method to determine an absolute value of ICP. The PI is more reliable than the TCD blood flow velocities for targeting therapeutic strategies in patients with severe head injury, but we must keep in mind that the PI tracks the changes of CPP over time rather than an absolute value of CPP.

Introduction Cerebral blood flow (CBF) is reduced around areas of contused brain after head injury [1]. However, since cerebral metabolism is also reduced, this may represent appropriate flow-metabolism coupling rather than ischaemia. Matching of CBF to metabolism is quantified as the oxygen extraction fraction (OEF). We performed magnetic resonance imaging (MRI) and oxygen-15 positron emission tomography (O-PET) to quantify CBF and OEF in regions of pericontusional oedema after head injury. Methods After local ethical approval, five patients with severe head injury underwent structural MRI and O-PET in the first week after injury. The two studies were performed in immediate succession, with every effort made to maintain stable physiology. The fluid attenuation inversion recovery (FLAIR) MRI sequence provides cerebrospinal fluid-nulled, T2-weighted MRI images, on which oedema is hyperintense. The FLAIR images were coregistered and voxel-resized to O-PET-derived maps of CBF and OEF, using a published methodology [2].
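A minimal sketch of the two indices used in the TCD study above: Gosling's pulsatility index, PI = (Vmax - Vmin)/Vmean, and cerebral perfusion pressure, CPP = MAP - ICP. Velocity and pressure values are invented.

    def pulsatility_index(vmax_cm_s, vmin_cm_s, vmean_cm_s):
        return (vmax_cm_s - vmin_cm_s) / vmean_cm_s

    def cerebral_perfusion_pressure(map_mmhg, icp_mmhg):
        return map_mmhg - icp_mmhg

    print(round(pulsatility_index(120.0, 40.0, 70.0), 2))  # 1.14
    print(cerebral_perfusion_pressure(90.0, 20.0))         # 70 mmHg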
Introduction Cerebral blood flow (CBF) is reduced around areas of contused brain after head injury [1]. However, since cerebral metabolism is also reduced, this may represent appropriate flow-metabolism coupling rather than ischaemia. The matching of CBF to metabolism is quantified as the oxygen extraction fraction (OEF). We performed magnetic resonance imaging (MRI) and oxygen-15 positron emission tomography (O-PET) to quantify CBF and OEF in regions of pericontusional oedema after head injury. Methods After local ethical approval, five patients with severe head injury underwent structural MRI and O-PET in the first week after injury. The two studies were performed in immediate succession, with every effort made to maintain stable physiology. The fluid attenuation inversion recovery (FLAIR) MRI sequence provides cerebrospinal fluid-nulled, T2-weighted MRI images, on which oedema is hyperintense. The FLAIR images were coregistered and voxel-resized to O-PET-derived maps of CBF and OEF, using a published methodology [2]. Each patient's coregistered FLAIR image was inspected for the largest and most apparent region of pericontusional oedema. These regions were then manually outlined and applied to the PET maps, and CBF and OEF in these regions of oedema were calculated. Results were compared with unit reference data obtained from healthy volunteers [3]. Results Normal values of CBF and OEF obtained from the volunteer datasets for mixed grey-white regions were 35 ± 5 ml/100 g/min and 45 ± 5%, respectively. While pericontusional regions showed a significantly lower CBF (20.5 ± 8.3 ml/100 g/min), we observed a wide range of values both across subjects and within individual lesions (14.7-35 and 5.1-52.3 ml/100 g/min, respectively). However, mean OEF values were low (35.4 ± 2.1%), with a much smaller range of values across patients (31.8-37.5%) and individual image voxels (24.9-41.8%). The 95% confidence intervals for the population mean pericontusional CBF and OEF were 10.2-30.8 ml/100 g/min and 32.8-38.1%, respectively. The wide range of CBF values should make clinicians wary of predicting the viability of tissue in regions of pericontusional oedema. The values are all above the stroke-literature threshold of 8.4 ml/100 g/min for irreversible ischemia [4]; thresholds for irreversible ischemia after head injury are unknown. The upper 95% confidence limit for the OEF is only 38% and the maximum OEF in any voxel was 42%, suggesting that the reduction in CBF may be appropriate to the metabolic demands of the tissue.
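As context for the flow-metabolism argument above: under the standard Fick relationship (not restated by the authors), OEF links oxygen consumption (CMRO2) to oxygen delivery,

\[ \mathrm{OEF} = \frac{\mathrm{CMRO}_2}{\mathrm{CaO}_2 \times \mathrm{CBF}} \]

so a fall in CBF matched by a proportionate fall in CMRO2 leaves OEF unchanged, whereas a fall in CBF with preserved metabolic demand drives OEF up towards its ischaemic ceiling. The low, narrow OEF range reported here is therefore the quantitative basis for reading the CBF reduction as coupled hypometabolism rather than ischaemia.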
Aim To determine whether there is a relationship between cerebral perfusion pressure (CPP), intracranial pressure (ICP) and survival in children with severe traumatic brain injury (sTBI) in the first 6 and 24 hours after ICP monitoring. We retrospectively reviewed the case notes of all children under the age of 16 years admitted to the Paediatric Intensive Care Unit in Southampton General Hospital following a head injury in whom intracranial pressure monitoring was undertaken over a 4-year period. ICP, CPP and mean arterial pressure were evaluated hourly, and means were calculated for the first 6 and 24 hours after ICP monitoring. The primary outcome measure was survival, and children were categorised into three groups: Group 1, overall functioning within the normal range; Group 2, survival but with some neurological impairment; and Group 3, death as a result of their injuries. Of 102 children admitted to the unit following a head injury during the study period, 59 had intracranial monitors placed within the first 24 hours. Nearly two-thirds (64.7%) were male, and all 59 suffered sTBI, with mean (SD) admitting Glasgow Coma Scores of 8 (3). The crude mortality rate was 10.2%. When comparing the mean ICP over the first 6 hours we found a significant difference between all three groups. The mean ICPs (SD) at 6 hours were as follows: Group 1, 10.61 mmHg (5.43); Group 2, 18.57 mmHg (7.34); and Group 3, 42.88 mmHg (23.64). The mean (95% confidence interval [CI]) ICP was 7.96 mmHg (CI 1.05-14.87) lower in Group 1 than in Group 2 (P < 0.05), and 24.31 mmHg (CI 13.90-34.71) lower in Group 2 than in Group 3 (P < 0.05). This difference between groups was maintained at 24 hours. Mean ICPs (SD) at 24 hours were: Group 1, 12.61 mmHg (5.12); Group 2, 20.35 mmHg (5.07); and Group 3, 44.69 mmHg (25.70). The mean (95% CI) ICP was 7.74 mmHg (CI 1.3-33.16) lower in Group 1 than in Group 2 (P < 0.05), and 24.34 mmHg (CI 14.31-34.35) lower in Group 2 than in Group 3 (P < 0.05) at this time. There was also a significant difference in CPP between Groups 1 and 3, and Groups 2 and 3, at both 6 and 24 hours, although no significant difference was detected between Groups 1 and 2. This study suggests that in children a low ICP in the first few hours after sTBI is crucial to intact survival and helps to differentiate it from a poor outcome (neurological impairment or death). CPP did not help to stratify those patients who survived (Group 1 and Group 2). The question remains whether we can improve outcome with aggressive measures to reduce the ICP or increase the CPP. The Wessex Head Injury Matrix (WHIM) [1] is one of the rare behavioural scales designed to follow the recovery of head-injured patients throughout the whole spectrum of altered states of consciousness, from emergence from coma to complete recovery. In this study, we explored the validity of the WHIM in relation to other behavioural assessment tools, namely the Glasgow-Liège Coma Scale (GLS) [2], the Coma-Near Coma scale (CNC) [3] and the Western Neuro Sensory Stimulation Profile (WNSSP) [4], as well as in relation to the Bi-Spectral Index (BIS), derived from electroencephalographic measures. Twenty-nine brain-injured comatose patients (aged 21-83 years) were followed longitudinally with these behavioural and electrophysiological measures. Overall, the evolution of scores on the WHIM correlated significantly with the evolution of scores on the GLS (r = 0.88; P < 0.01), the CNC (r = -0.8; P < 0.01) and the WNSSP (r = 0.87; P < 0.01), as well as with the BIS measure (r = 0.58; P < 0.01). Relative to the GLS, CNC and WNSSP, the WHIM showed particularly good sensitivity for documenting subtle changes in recovery in patients in a minimally conscious state. The BIS index globally evolved in parallel with the behavioural scales. However, it showed very poor sensitivity, as many patients in a coma or a vegetative state presented BIS scores as high as those observed in patients who had regained normal consciousness. The results confirm the usefulness of the WHIM, especially for the assessment of minimally conscious patients. However, even if a global relation with the behavioural scales is observed, the validity of electrophysiological measures such as the BIS index is unsatisfactory for the assessment of altered states of consciousness. P314 Neuropsychological testing in the locked-in syndrome: preliminary results from a feasibility study The locked-in syndrome (LIS) is characterised by complete loss of voluntary motor output but preserved sensory input and consciousness, as a result of a ventral pontine lesion. Communication is only possible via spared vertical eye movements and/or eyelid blinking. The aim of this study was to adapt standard neuropsychological tests to an eye-response mode for use in LIS patients. We assessed five patients who had been in LIS for 3-6 years (age 24-57 years) and 10 controls with a modified version of the forward and backward digit span (working memory), the Doors and People Test (episodic memory [1]), the Wisconsin Card Sorting Test (executive functioning [2]), the LEXIS (phonological and lexico-semantic processing [3]), the EVIP (vocabulary knowledge [4]) and two new tests designed to measure sustained and selective attention for auditory stimuli. Like the LIS patients, the control subjects had to respond via eye movements. The results showed that the patients' performance was in the normal range for most measures. However, differences in performance between subjects were found.
This study demonstrates the feasibility of complete neuropsychological testing in chronic LIS survivors. It re-emphasises that LIS patients recover globally intact cognitive potential. Nevertheless, the inter-individual differences observed suggest that this battery is of interest for detecting specific deficits and thereby maximising communication between the family, the medical staff and the patient in LIS. Thirty-five NICU patients (age 48.9 ± 19.5 years, male 64%, median Glasgow Coma Scale motor score 5; traumatic brain injury 63%, aneurysmal subarachnoid hemorrhage 20%, postoperative 17%) were studied. Infection was clinically suspected in all patients, and in 54% of them it was confirmed by microbiological data. We analysed a total of 177 SC doses given at a mean of 185 ± 89 hours after ICU admission, with a median of five administrations per patient. The mean dose was 13.7 ± 6 mg (0.17 ± 0.04 mg/kg). T° decreased significantly after DCF SC, from 38.4 ± 0.4 to 37.6 ± 0.5°C (P < 0.0001), as did the ICP, from 16 ± 8 to 12.8 ± 6 mmHg (P = 0.0002). PaCO2 and SjvO2 were not different pre and post DCF. The CPP was stable after DCF (pre, 71 ± 15 mmHg; post, 69 ± 15 mmHg [NS]). The HR dropped significantly (from 97 ± 21 to 89 ± 20 beats/min, P < 0.0001). Blood gas analysis and renal and hepatic parameters were not different after DCF SC. Diuresis was maintained, decreasing only modestly (from 175 ± 97 to 142 ± 102 ml/hour, P < 0.05). The effects of DCF on T° and ICP are shown in Fig. 1. The effects of DCF on MAP, CPP and hourly urine output are shown in Fig. 2. We conclude that DCF SC at low dosage was advantageous and effective. It enabled good reductions in body T° with an associated reduction in the ICP. Side effects on CPP or MAP were minimal, and renal and hepatic functions were not affected. Background A wide variety of drugs are currently being tested for the management of acute ischemic stroke. Glycoprotein IIb/IIIa inhibitors and low molecular weight heparins show great therapeutic potential in the management of acute ischemic stroke. This trial tested the effects of enoxaparin, tirofiban and the combination of both on infarct volume, short-term clinical outcome and the risk of intracerebral hemorrhage (ICH) in an animal experimental model. An autologous fibrin-rich clot was introduced into the common carotid artery. Two hours after embolization, placebo, enoxaparin, tirofiban or enoxaparin/tirofiban was infused intravenously over 30 min. The neurologic deficit was evaluated using a neuroscore 24 and 48 hours after the stroke. The incidence of ICH and the infarct volume were evaluated through histologic analysis. The infarct volume was larger in the placebo group than in the enoxaparin, tirofiban and enoxaparin/tirofiban groups. The neuroscore at 24 and 48 hours was higher in the group receiving both drugs simultaneously. No ICH was present in any of the groups (Table 1). Conclusion It is probable that, besides the anticoagulant and antiaggregant effects of enoxaparin and tirofiban, other intrinsic effects of each drug have an impact on infarct volume. Combining both drugs resulted in a synergistic effect, probably due to prevention of reperfusion damage. A slight improvement of the short-term clinical outcome could be seen with the drugs, suggesting that, with less tissue damage, better long-term clinical improvement can be expected. In this trial, none of the drugs increased the risk of ICH.
Further investigation is needed to define the therapeutic window of opportunity for both of these agents. Introduction Intractable status epilepticus not responding to conventional pharmacotherapy is a medical emergency. Deep suppression of cortical activity, documented electrocerebral silence and a titratable length of time during which such electrocerebral silence can be maintained all make thiopentone the ideal drug for the initial management of status epilepticus [1,2]. A long-lasting antiepileptic drug regimen can be established during thiopentone-induced burst suppression, which can then be tapered and discontinued with minimal chance of recurrence. Methods Twenty-one pediatric patients suffering from idiopathic generalized tonic-clonic disorder with age of onset of 12 months-2 years were included in the study. All patients were admitted to the pediatric ICU after initial management in the emergency room. Rapid sequence intubation with cricoid pressure was performed with an induction dose of 4 mg/kg thiopentone intravenously (IV) and 1.5 mg/kg succinylcholine IV. Additional boluses of 4 mg/kg thiopentone were administered at 5-30 min intervals until complete areflexia was achieved, while a continuous intravenous thiopentone infusion (0.5-4 mg/kg/hour) was maintained and titrated to obtain electroencephalogram (EEG) burst suppression. Continuous single-channel processed EEG using a cerebral function analyzing monitor and intermittent multichannel EEG were performed. After control of clinical and electroencephalographic seizure activity, patients were started on phenytoin and phenobarbitone. The thiopentone infusion was progressively tapered over 24 hours and finally discontinued once therapeutic serum levels of these antiepileptic medications were achieved. Statistical values were obtained from 2 × 2 tables of outcome versus EEG suppression employing Fisher's exact test, with significance quantified as P < 0.05. Of the 21 patients, 16 showed burst suppression and five showed a 'flat' record. Two patients in the burst suppression category showed recurrence of seizure activity after initial control, and none in the flat-record group. In these two patients, EEG seizures recurred earlier than clinical seizures and were rapidly controlled by increasing the rate of the thiopentone infusion. More sustained control of seizure activity was achieved by adding valproic acid to the antiepileptic regimen in these two patients. We conclude that initiation and tapering of a thiopentone infusion in the ICU setting, tightly controlled by serum levels, carefully monitored with EEG for therapeutic efficacy, and supported by mechanical ventilation and hemodynamic monitoring, allows the physician to establish therapeutic serum levels of conventional antiepileptic agents and to reduce the relapse rate.
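A minimal sketch of the dosing arithmetic in the protocol above, assuming only the figures stated in the abstract. This is an illustration of the regimen's structure, not a clinical tool; dosing and EEG assessment are clinical decisions.

def thiopentone_induction_mg(weight_kg):
    """Induction bolus: 4 mg/kg IV, per the protocol above."""
    return 4.0 * weight_kg

def thiopentone_infusion_mg_per_hour(weight_kg, fraction_of_range):
    """Continuous infusion titrated within 0.5-4 mg/kg/hour to EEG burst
    suppression; fraction_of_range in [0, 1] selects a point in that band."""
    rate_mg_kg_h = 0.5 + fraction_of_range * (4.0 - 0.5)
    return rate_mg_kg_h * weight_kg

# Example: a hypothetical 12 kg toddler, infusion started mid-range.
print(thiopentone_induction_mg(12.0))               # 48.0 mg bolus
print(thiopentone_infusion_mg_per_hour(12.0, 0.5))  # 27.0 mg/hour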
Introduction Various models are used to predict the mortality of patients admitted to the intensive care unit (ICU) based on first-ICU-day findings. Some of these models have developed daily scoring systems for the subsequent ICU days. There is no consensus regarding the definition of futility, and there are no reliable ways of identifying patients for whom ICU care is futile. The purpose of this retrospective study was to determine whether we can identify factors associated with futility in the sickest patients admitted to the ICU. We hypothesized that any increase in the acute physiology score (APS) from the first ICU day to the third ICU day would identify very sick patients with futile care. We defined the sickest critically ill patients as those with a first-day APACHE III predicted mortality rate of 80% or higher. Among 43,605 ICU admissions entered in the APACHE III database from 1994 to 2002, 15,512 stayed in the ICU for 3 or more days and 748 (1.7%) had a first-day predicted mortality rate of 80% or higher. Only 308 of the 748 patients (41%) survived 3 days in the ICU. Excluding six patients who did not authorize their medical records to be reviewed for research, 302 admissions were included in the study. Demographics, first-day and third-day APSs and probability of hospital death, and date of death were obtained. The patients were predominantly (92%) Caucasian; 54% were male. The first and third ICU day mean APSs were 106.8 and 70.5, respectively, and the predicted mortality rates were 87.8% and 86.5%, respectively. The observed hospital mortality rate was 61.3%. There was an increase in APS on the third ICU day, compared with the first ICU day, in 34 patients (11.3%). Only two of the 34 patients (6%) with an increased APS survived to hospital discharge, compared with 115 of 268 (43%) without an increase (P < 0.0001). Of the two patients who survived to hospital discharge, one died within 24 hours of discharge and the second, who was admitted to the ICU for multiple trauma, died 3 years after hospital discharge. Conclusion An increase in the APS on the third ICU day in the sickest patients identifies a group of patients whose short-term and long-term prognoses are dismal. Introduction Base excess (BE) and lactate (LAC) have been used to monitor ICU patients. Each reflects different pathophysiological derangements in perfusion, inflammation and renal function. The individual significance of BE or LAC for predicting the outcome of critically ill patients is still uncertain and was the focus of this retrospective study. We retrieved 333 patients from our prospectively collected database from January to December 2000. Age, diagnosis, APACHE II score, and BE and LAC at admission and after 24 hours of admission were recorded. Univariate and multivariate analyses were performed, the latter being based on a matrix of collinearity (Pearson coefficient ≥ 0.4 denotes collinearity) and on the results of the univariate analysis. A receiver-operator characteristic (ROC) curve was built to identify the best predictive value for mortality. The age was 51 ± 18 years, APACHE II was 21 ± 1, BE and LAC at admission were -6.0 ± 7.6 mmol/l and 4.9 ± 9.7 mmol/l, respectively, and BE and LAC after 24 hours were -5.5 ± 6.2 mmol/l and 5 ± 9.6 mmol/l, respectively. The variations of BE and LAC were calculated as the 24-hour value minus the admission value, and were 0.4 ± 6.6 and 0.03 ± 5.6 mmol/l, respectively. Conclusions Lower values of BE and higher values of LAC are associated with poor prognosis in ICU patients. In a comparative analysis of these two variables, BE measured after 24 hours of admission has the best discriminatory power to predict outcome. We aimed to assess temporal changes in the performance of both the PIM and PIM 2 scores. Specifically, we hypothesised that the 6-year period between commencement of data collection (1998) and publication (2003) of the revised PIM 2 score may have allowed for significant decalibration.
Methods A prospective data collection from a single, 20-bed tertiary PICU over 5 years (1999-2003). The standardised mortality ratio (SMR) was calculated using the standard formula, discrimination was assessed via the area under the receiver-operating characteristic curve (ROC), and calibration using the Hosmer-Lemeshow goodness-of-fit test. Scores were calculated for 4183 patient episodes (average 800-900 admissions/year). There was no significant temporal variation in either case mix (cardiac surgery 27-31%) or disease severity (data not shown). Both scores discriminated well, consistently yielding an area under the ROC curve > 0.75 (Fig. 1 top). Not surprisingly, PIM demonstrated a loss of calibration (Hosmer-Lemeshow χ2 > 15.5, P < 0.05) from 2000 onwards. PIM 2 also showed a temporal trend towards decalibration that became apparent by 2003. This trend was mirrored by a progressive reduction in SMR (Fig. 1 bottom), such that the upper confidence limit for the PIM 2 SMR was less than 1.00 by 2002. Interestingly, the SMR derived from either score decreased at a similar rate, 0.07-0.08 per year. Conclusions Both scores continue to discriminate well between survival and nonsurvival. As expected, PIM 2 is better calibrated than PIM, although both appear to be decalibrating at a similar rate. Because PIM 2 reflects the 1998-1999 standard of care, many PICUs will currently exhibit fewer deaths than expected (SMR < 1.00). More frequent recalibration appears necessary.
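A minimal sketch of the two statistics used above, with invented data for illustration: the SMR as observed over expected deaths, and a Hosmer-Lemeshow-type chi-square over risk-ordered groups (the abstract does not specify its grouping; deciles are assumed here).

import numpy as np
from scipy.stats import chi2

def smr(deaths, predicted_risk):
    """Standardised mortality ratio: observed deaths / expected deaths."""
    return deaths.sum() / predicted_risk.sum()

def hosmer_lemeshow(deaths, predicted_risk, groups=10):
    """Chi-square over risk-ordered groups; large values mean poor calibration."""
    order = np.argsort(predicted_risk)
    stat = 0.0
    for idx in np.array_split(order, groups):
        observed = deaths[idx].sum()
        expected = predicted_risk[idx].sum()
        n = len(idx)
        stat += (observed - expected) ** 2 / (expected * (1 - expected / n))
    return stat, chi2.sf(stat, groups - 2)  # common df convention

rng = np.random.default_rng(0)
risk = rng.uniform(0.01, 0.9, 900)               # invented predicted risks
deaths = (rng.random(900) < risk).astype(float)  # invented outcomes
print("SMR:", round(smr(deaths, risk), 2))
print("H-L chi2, P:", hosmer_lemeshow(deaths, risk))

An SMR drifting below 1.00, as reported above, simply means that observed deaths fall increasingly short of the score's expectation.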
A prolonged ICU length of stay (LOS) has been associated with many medical diagnoses and conditions, but it is difficult to predict on admission to the ICU. Prolonged ICU LOS can adversely affect patient outcomes by increasing the risk of complications, and possibly mortality. Identification of such indicators may help to improve ICU resource utilization (e.g. bed triage, ICU staffing). Objective To systematically review the literature to determine common 'early' predictors of ICU LOS for adult patients. Results Five studies published in full were identified, with a combined total of 16,107 patients. The definition of prolonged LOS varied with the population evaluated (e.g. > 14 days general ICU, > 5 days CVICU) and thus prevented meta-analysis. Approximately 10% of patients had prolonged LOS as defined by the study. Universal positive early indicators were: emergent surgery or admission, trauma, or need for mechanical ventilation within 24 hours. Abbreviated LOS was associated with coma, DNR orders, and nontrauma surgical reasons for admission. APACHE, created as a predictive model for mortality, consistently did not predict LOS. Conclusion Patients with prolonged stay form a small percentage of ICU patients. There are few early predictors that are universally significant or important when determining prolonged ICU LOS. Many factors appear to be specific to subgroups and thus may limit the broad applicability of general ICU scoring systems to specialized populations. Results A total of 2372 mechanically ventilated patients were included (mean age 68 years, 62% male, 42% cardiovascular, 22% pulmonary, 8% neurologic and 28% other reasons for MV); in 74%, intubation (ITN) and MV were performed emergently. On the first day, 44% of the patients were ventilated in the bilevel positive airway pressure mode and 39% in the continuous positive pressure ventilation mode. The duration of MV was 2 (1-7) days, and the length of CICU stay 20.5 (13-32) days. Overall, 47.3% of patients died. Figure 1 shows the distribution of hospital mortality (HM) in the 21 CICUs with > 50 patients in the registry (range of HM 33-70%). Emergent ITN, cardiopulmonary resuscitation, in-hospital ITN, shock, sepsis, pneumonia, ventricular tachycardia/fibrillation and sudden cardiac death were associated with HM in a univariate analysis. Limitations Admission severity of illness, organ dysfunction during the CICU stay and causes of death were not assessed. Conclusion Mortality is high in mechanically ventilated patients with cardiovascular and pulmonary disorders. The BEAT registry provides important data for benchmarking and is a first step towards quality improvement in this subset of critically ill patients with high mortality and morbidity. Introduction Intensive care physicians are facing a growing number of immunocompromised patients. The evidence on how immunosuppression impacts on ICU outcome is limited. We conducted a retrospective cohort study to compare immunocompromised patients with immunocompetent patients with regard to their ICU outcome and, furthermore, to identify prognostic factors. Methods Over a period of 36 months, 656 patients were referred to a medical eight-bed ICU of a university hospital. Of these, 190 were categorized as immunocompromised. Immunosuppression was defined as an absolute neutrophil count < 1000/µl (n = 66) at admission, or the administration of immunosuppressive drugs (IS) prior to admission (n = 124). The IS used were usually corticosteroids and cyclosporine, and more rarely FK 506, cyclophosphamide and methotrexate. We recorded demographic data, reason for admission, APACHE III score, occurrence of septic shock and the need for and duration of mechanical ventilation (MV). The primary endpoint of the study was ICU survival. The groups were compared by Fisher's exact test for categorical variables and by the Mann-Whitney U test for continuous variables (univariate analysis). Using a multivariate logistic regression model, we controlled for several risk factors. The overall mortality was 33%. Immunocompromised patients had a significantly higher mortality than immunocompetent patients (45% vs 29%, P < 0.001). This difference was even more pronounced if patients required MV (64% vs 34%, P < 0.001). Patients with neutropenia tended to have a higher mortality than patients with IS therapy, but this difference failed to reach statistical significance. The presence of septic shock at any time during ICU treatment and the associated mortality from septic shock did not significantly differ between immunocompromised and immunocompetent patients. In multivariate analysis, lower APACHE III scores at admission and admission for postoperative care were independently associated with reduced ICU mortality, whereas immunosuppression and MV were independently associated with unfavourable outcome. Immunosuppression remained associated with higher ICU mortality when the statistical model was adjusted for APACHE III score, postoperative care and mechanical ventilation. Conclusion Immunosuppression is an independent risk factor for death in the ICU. In order to develop preventive strategies, controlled studies are needed to identify further risk factors in these patients. This study aims to compare the ability of admission lactate, lactate at 24 hours and the APACHE II risk of death (ROD) score to discriminate between intensive care survivors and nonsurvivors. We also combined lactate at 24 hours with the APACHE ROD score to obtain the best discrimination values.
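A minimal sketch of the combination step reported below: fit a logistic regression on the 24-hour lactate plus the APACHE ROD and compare areas under the ROC curve. All data here are invented; only the cohort size (240) and the general approach come from the abstract.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 240                                   # cohort size from the abstract
lactate_24h = rng.lognormal(0.5, 0.6, n)  # invented 24-hour lactate, mmol/l
apache_rod = rng.uniform(0.05, 0.9, n)    # invented APACHE II risk of death
logit = -3.0 + 0.6 * lactate_24h + 2.5 * apache_rod
died = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([lactate_24h, apache_rod])
combined = LogisticRegression().fit(X, died).predict_proba(X)[:, 1]

for name, marker in [("lactate 24 h", lactate_24h),
                     ("APACHE ROD", apache_rod),
                     ("combined", combined)]:
    print(f"AUC {name}: {roc_auc_score(died, marker):.2f}")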
Two hundred and forty consecutive admissions to a nine-bed general intensive care unit (ITU) were prospectively investigated. Lactate values at admission and at 24 hours were recorded. Each patient had the APACHE II ROD score calculated from data submitted to the Intensive Care National Audit and Research Centre. The ITU and hospital mortalities were analysed. The area under the receiver-operator characteristic (ROC) curve was calculated for each measure or combination of measures to test discrimination between survivors and nonsurvivors. The overall intensive care and hospital mortalities were 29% and 39%, respectively. Both lactate on admission and lactate at 24 hours were significantly and strongly associated with intensive care mortality (P < 0.001 for both). Areas under the ROC curves show that both lactate measures discriminated between survivors and nonsurvivors (Table 1). The best discrimination was obtained by a combination of lactate at 24 hours and the APACHE probability (obtained from a logistic regression). The 24-hour lactate values corresponding to 50% and 75% mortality were 1.5 and 3.6 mmol/l, respectively. Lactate at admission and at 24 hours is significantly and strongly associated with intensive care mortality. Seventy-five per cent of patients with a lactate value of 3.6 mmol/l at 24 hours will not survive. The 24-hour lactate discriminates as well as the APACHE II ROD score between intensive care survivors and nonsurvivors, and is quicker and easier to obtain. Combining lactate with the APACHE II ROD score appears to improve the discriminatory power of this outcome prediction model. These data may help in discussions regarding patient prognoses in intensive care. In view of the strong association and discrimination between intensive care survivors and nonsurvivors, and the widespread clinical use of lactate measurements, the 24-hour lactate value should be included in future scoring systems. The difference between measured and calculated osmotic pressure (the osmolar gap) was recorded in 72 patients admitted to two medical-surgical intensive care units (ICUs). The measurement took place immediately after admission to the ICU, independent of the cause of admission. The SAPS II and APACHE II scores were recorded, as well as the ICU outcome. Patients were separated retrospectively into two groups on an outcome basis: A = alive, B = dead. The results are presented in Table 1. The osmolar gap calculated at admission, as well as the measured osmolarity, may be a good indicator of ICU mortality. Although the calculated osmolarity does not seem to be a reliable prognostic indicator in critically ill patients, it is used in the calculation of the osmolar gap, which appears to be a better prognostic indicator of ICU mortality than the calculated osmolarity itself. Patients with an osmolar gap > 24 mosm/l at ICU admission had a 53% probability of death, while patients with an osmolar gap < 14 mosm/l had a 28% probability of death. An osmolar gap of 35 mosm/l at admission was associated with a 1.33-fold increase in the probability of death (odds ratio), according to our preliminary results.
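For reference, the osmolar gap is the measured osmolarity minus a calculated value; the abstract does not state which calculation was used, but a common SI-unit form (all concentrations in mmol/l) is:

\[ \mathrm{Osm}_{\mathrm{calc}} = 2[\mathrm{Na}^+] + [\mathrm{glucose}] + [\mathrm{urea}], \qquad \mathrm{osmolar\ gap} = \mathrm{Osm}_{\mathrm{measured}} - \mathrm{Osm}_{\mathrm{calc}} \]

A large gap indicates unmeasured osmotically active solutes, which is the presumed link to severity of illness.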
Introduction Prior studies suggest that uninsured patients presenting to the emergency department with acute asthma receive poor-quality care. However, we do not know whether insurance status affects resource use and outcome after hospital admission for patients with status asthmaticus (SA). We examined all nonfederal hospital discharges in 1999 from six US states. We selected patients hospitalized with SA and defined severe SA as SA with ICU admission. We excluded Medicare patients and focused on working-age adults (18-64 years), classifying insurance as commercial, Medicaid and uninsured. We assigned comorbidity using the Charlson-Deyo index. We examined the association between insurance status and mechanical ventilation (MV), hospital costs and hospital mortality. Of the 8.2 million hospital discharges, 62,968 (0.8%) had SA and 9742 had severe SA, of whom 4007 (41.1%) were working-age adults (population incidence of 10.7/100,000). Compared with commercially insured patients (n = 2299), Medicaid (n = 1057) and uninsured (n = 651) patients had higher MV rates (30.1% vs 39.1% and 40.1%, P < 0.001). Hospital length of stay (LOS), ICU LOS, mortality and mean costs were 6.2, 8.6 and 5.1 days; 3.5, 4.4 and 2.6 days; 1.9%, 4.2% and 1.8%; and $8800, $12,600 and $7400, respectively, for commercial, Medicaid and uninsured patients. Adjusting for age, comorbidity, gender and race, Medicaid and uninsured patients were more likely to receive MV than commercially insured patients (odds ratios 1.51 and 1.41, P < 0.001 for each). Mortality was also higher, but this observation was not significant (odds ratio 1.17, P = 0.9, and odds ratio 2.1, P = 0.64). Uninsured patients also incurred lower adjusted hospital costs (P < 0.001) and shorter hospital LOS (P < 0.001). In comparison with patients managed under commercial insurance, patients admitted to hospital with no insurance or state-subsidized insurance appear sicker, as evidenced by higher MV rates and worse mortality. Possible reasons include a worse spectrum of disease in these populations, delayed presentation to hospital during an acute attack, or restricted admission policies. The Mortality in Emergency Department Sepsis (MEDS) score is a previously derived, validated and published clinical decision rule for predicting 28-day in-hospital mortality. It is based on the following scoring system: underlying disease with expected fatality < 30 days or metastatic cancer (six points), tachypnea or hypoxia (three points), septic shock (three points), platelets < 150,000 (three points), bands > 5% (two points), age > 65 years (two points), lower respiratory infection (two points), nursing home resident (two points), anion gap, and altered mental status (two points). The points are added and a risk group assigned. Objective The objective of this study was to assess the ability of the MEDS score to predict 1-year mortality. Methods A prospective observational cohort study of consecutive emergency department (ED) patients seen at an urban university hospital with 50,000 annual ED visits. The study period was 1 February 2000-1 February 2001. All patients aged 18 years or older at risk for infection, as indicated by the ED physician ordering a blood culture, were included. The MEDS score divides patients into very low risk (0-4 points), low risk (5-7 points), moderate risk (8-12 points), high risk (13-15 points) and very high risk (> 15 points). The raw survival figures are reported with 95% confidence intervals (CIs) and compared with each other using Tukey's test for pair-wise comparisons.
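A minimal sketch of the point assignment and risk grouping listed above. The items and cut-offs follow the abstract; the 'anion gap' item is listed ambiguously there and is omitted here, and the predictor names are our own shorthand.

# Points per predictor, following the listing in the abstract above.
MEDS_POINTS = {
    "terminal_illness": 6,   # expected fatality < 30 days or metastatic cancer
    "tachypnea_or_hypoxia": 3,
    "septic_shock": 3,
    "platelets_below_150k": 3,
    "bands_above_5pct": 2,
    "age_above_65": 2,
    "lower_respiratory_infection": 2,
    "nursing_home_resident": 2,
    "altered_mental_status": 2,
}

def meds_risk_group(findings):
    """Sum the points for the present findings and map to a risk group."""
    score = sum(MEDS_POINTS[f] for f in findings)
    for upper, label in ((4, "very low"), (7, "low"), (12, "moderate"), (15, "high")):
        if score <= upper:
            return score, label
    return score, "very high"

print(meds_risk_group({"septic_shock", "age_above_65", "lower_respiratory_infection"}))
# -> (7, 'low')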
Results Of 3926 patient visits eligible for the study, 3763 (96%) were included. The overall 1-year mortality was 24% (95% CI: 22.7-25.4%) (904/3763). The 1-year mortalities for the groups were: very low risk 8.6% (0.08-1.2%), low 23% (1.0-3.5%), moderate 40% (5.8-10%), high 68% (11-24%) and very high 79% (37-66%), with all groups being statistically different from each other by Tukey's test for pair-wise comparisons. The MEDS score is a good predictor of 1-year mortality in patients presenting to the ED with sepsis. Independent multicenter validation is needed prior to widespread application of this rule, to test its performance in other patient populations. P337 Iatrogenic occurrences at the intensive care unit: impact on the patients' severity and on the nursing workload This descriptive exploratory study aimed to characterize iatrogenic occurrences and to evaluate their impact on patients' clinical condition and on the nursing workload at the intensive care unit (ICU). Data were prospectively collected during a 3-month period in two general ICUs of a hospital in the city of São Paulo, using a file card to record the occurrences. Patient severity and nursing workload were evaluated, respectively, by means of the Simplified Acute Physiology Score (SAPS II) and the Therapeutic Intervention Scoring System-28 (TISS-28). The study population consisted of 212 patients, of whom 47 (22.0%) suffered a total of 80 iatrogenic occurrences during their ICU stay. Among these patients, 57% suffered one occurrence, 28.0% two occurrences and 15.0% three or four occurrences. Regarding the type of event, 27.0% were related to pressure ulcers, followed by 24.0% related to handling of the orotracheal cannula and 20.0% to blood catheters. The remainder occurred during administration of medications (13.0%), care of probes (10.0%) and drains (5.0%), and handling of equipment (1.0%). No occurrences related to infrastructure were found. Regarding the impact of occurrences on patient severity and on the nursing workload, no statistically significant difference was found between the mean admission SAPS II and TISS-28 scores of patients with and without occurrences. When comparing severity and nursing workload in affected patients before and after the event, a difference was observed only in the mean TISS-28 scores (P < 0.001). When this analysis was repeated for the different types of occurrence, a statistically significant difference was found in both the SAPS II (P = 0.042) and TISS-28 (P < 0.001) scores for occurrences involving handling of the orotracheal cannula. The results of this investigation reinforce the need for investment in training professionals to work with critical patients, as a major measure for safe nursing care and quality in the ICU. The most common stage of medication error was reported in one study to be the prescribing stage, which accounted for 56% of errors detected [1]. Electronic prescribing (EP) with decision support has been shown to reduce the medication error rate [2], but no studies could be found in critical care. This study compares the medication error rate before and after the implementation of the GE Systems QS 5.6 clinical information system (CIS), which does not have decision support. During the study periods, all medication errors identified by the ICU pharmacist were recorded. Errors were identified using a published definition [3], except that abbreviations of drug names were not regarded as errors.
ICU staff were unaware that the study was taking place. The location was a 22-bed general ICU/HDU at a teaching hospital. Data were collected in 2002 for two periods (9 days in total) before CIS introduction and for four periods (17 days in total) in weeks 2, 10, 25 and 37 after CIS introduction. The total number of drugs prescribed was recorded. There was a statistically significant reduction in the medication error rate following the introduction of the CIS. The error rate before CIS was 6.7% (69 errors from 1036 prescriptions) and after CIS introduction was 4.7% (115 errors from 2429 prescriptions) (χ2 = 5.34, one degree of freedom [df], P < 0.03). There was variation in the error rate with EP over time (χ2 = 21.7, three df, P < 0.001) as the staff got used to the new system and prescribing practices were improved. There is also strong evidence of a linear trend (χ2 = 11.9, one df, P < 0.001); thus the error rate appeared to decrease with time on EP. In conclusion, the introduction of the CIS coincided with a reduction in the overall medication error rate, with some suggestion of a 'learning curve'.
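The headline comparison above reduces to a chi-square test on a 2 × 2 table of erroneous versus error-free prescriptions. As a sketch, the reported counts can be checked as follows (Yates' continuity correction is disabled so that the statistic matches the uncorrected value quoted above):

from scipy.stats import chi2_contingency

# Rows: before and after CIS; columns: errors, error-free prescriptions.
table = [[69, 1036 - 69],     # before CIS: 6.7% of 1036 prescriptions
         [115, 2429 - 115]]   # after CIS:  4.7% of 2429 prescriptions
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, df = {dof}, P = {p:.3f}")  # approx. 5.3, 1, P < 0.03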
P339 Signal merging and signal fusion to enhance robustness and reduce false alarms during multichannel patient monitoring It is recognised that false alarms lead to habituation among clinical staff and to distress for patients. The goal of our work is to reduce the number of false alarms produced by a standard patient monitor and to provide early warning of patient deterioration based on multiple vital signs. Two approaches have been used. In the first ('signal merging'), multiple signals are merged in order to produce a more robust derived measure. For example, cardiac and respiratory information are extracted from the electrocardiogram and photoplethysmogram, and are combined to produce more robust estimates of heart and respiration rates. The second ('signal fusion') uses a combination of advanced signal processing methods to 'integrate' multiple vital signs, to develop a global representation of patient status. The hypothesis is that if a monitoring system can identify periods of physiological instability preceding clinically apparent adverse events, then early warning may result in reduced monitoring alarms, prompt treatment and improved outcome. Over a 3-year period at the John Radcliffe hospital in Oxford, UK, we collected noninvasive physiological data for over 24 hours in 150 'high-risk' patients, including continuous heart rate, respiratory rate, oxygen saturation and temperature, and intermittent noninvasive blood pressure. The system was 'trained' to distinguish periods of physiological stability from instability. Two evaluation studies have taken place. The first assessed the signal-merging technology during simulated patient transport. Results showed that signal merging gives a more precise estimate of respiration rate than a standard transport monitor, suggesting that signal merging may reduce the number of false alarms caused by external movement. The second study assessed the alarms of a standard patient monitor versus our signal fusion technology. Numerous standard monitor alarms were recorded that did not correspond to physiological events, often due to signal artefact or nonclinical events such as patient movement, but typically these did not trigger an alert from our signal fusion technology. Signal fusion identified physiological events that included atrial fibrillation, elevated blood pressure, extreme distress and oxygen desaturation. The conclusions are that signal merging and fusion techniques may improve robustness, reduce false-positive alarms and potentially provide early warning of adverse physiological events. Introduction Frequent exposure to stressful events in the ICU may be associated with an increased prevalence of depression and burnout. This study was planned to assess burnout syndrome and its relation to depression in the nursing staff of ICUs in Turkey. Mixed, surgical and internal medicine intensive care unit (ICU) nurses and non-ICU nurses (n = 178) who had been working in a metropolitan university hospital (UH) or in state hospitals (SH) for at least 1 year were enrolled in the study (UH mixed ICU n = 21, UH surgical ICU n = 19, UH internal medicine ICU n = 17, SH mixed ICU n = 18, SH surgical ICU n = 23, SH internal medicine ICU n = 21, UH anaesthesia nurses n = 22, UH operating room nurses n = 37). Anaesthesia and operating room nurses, and non-ICU nurses, were considered control groups, as they do not take part in the follow-up of patients. The Beck Depression Inventory (0-63) and the Maslach Burnout Inventory adapted for a Turkish population were used to assess depression and burnout, respectively [1,2]. The latter has three subscales: emotional exhaustion (0-36), depersonalisation (0-20) and personal accomplishment (0-32). One-way ANOVA with Bonferroni's post hoc test and the chi-square test were used, and P < 0.05 was considered significant. Pearson correlation analysis was performed to investigate the in-group correlation between Beck depression scores and the subscales of the Maslach Burnout Inventory. Values are expressed as mean and SD. The demographic data were similar among groups. The Beck depression scores of the nurses in the university hospital (16.6 ± 9) were higher than those in the state hospitals (13.1 ± 8) (P = 0.028). The Beck depression scores of the mixed ICU nurses in the university hospital (19.8 ± 8) were higher than those of the non-ICU nurses (12.8 ± 7.6) (P = 0.001). Depersonalisation scores of surgical ICU nurses in the university hospital (7.8 ± 4) were higher than those of the non-ICU nurses (5.6 ± 3.6) (P = 0.03). Although a significant difference was not found among the nurses working in different ICUs in terms of the Beck Depression Inventory, mixed ICU nurses had considerable depression scores (mixed ICU 17.4 ± 8, surgical ICU 13.6 ± 9, internal medicine ICU 13.5 ± 8, non-ICU group 12.8 ± 7.6) when all ICU nurses were considered together. A positive correlation was found in all groups, except the control group, between the Beck depression score and emotional exhaustion (P < 0.01, r > 0.52). Conclusion Nurses working in the ICU, especially in the mixed ICU, may have a tendency to depression. All ICU staff should be screened for depression and burnout. Patients and methods From June 1997 to December 1999, all admissions (≥ 18 years) to our medical intensive care unit (ICU) who were treated for at least 24 hours were eligible. On admission, the pre-ICU functional status and subjective well-being were assessed by interview [1]. Six months after admission, survivors' memory of the ICU stay was assessed (none, positive, negative). At 18-month follow-up, a standardized interview at the patient's home was performed using the PTSD-10 Questions Inventory (PTSD-10) [2], the 90-item Revised Symptom Checklist (SCL-90-R), the Hamilton Anxiety Scale and Hamilton Depression Scale, the 57-item Giessen Subjective Complaints List and a 28-item quality of life (QOL) scale. A total of 444 patients were enrolled.
Cumulative mortality rates were 23% in the ICU, 33% in the hospital, 42% at 6-month follow-up and 53% at 18-month follow-up. Of the 209 survivors, 22% were lost to follow-up, 27% were unable to be interviewed for physical or cognitive reasons and 13% declined the interview. The remaining 80 study patients had a mean age of 46 ± 12 (± SD) years; 69% were male, the mean ICU length of stay was 12 ± 17 days, the mean APACHE II score after 24 hours was 19 ± 9 and the mean SOFA total maximum score was 6.3 ± 4.7. According to PTSD-10 criteria, 10 patients (12.5%) had a diagnosis of PTSD. PTSD was more frequently diagnosed in patients who had reported poor pre-ICU subjective well-being compared with patients with good subjective well-being (8/41 vs 2/39 patients; 20% vs 5%; P = 0.05), in patients with multiple organ dysfunction (MOD) compared with patients without MOD (8/38 vs 2/42 patients; 21% vs 5%; P = 0.03), and in patients who had negative or no memories of their ICU stay compared with patients with positive memories (7/30 vs 3/50 patients; 23% vs 6%; P = 0.02). Patients with PTSD had significantly worse scores on the SCL-90-R global index of psychopathology (P < 0.0001), showed a significantly higher degree of somatic and psychic anxiety, major depression, bodily complaints and mental exhaustion (P < 0.0001), and reported poorer self-perceived QOL. Conclusion A small subgroup (12.5%) of our medical ICU survivors developed PTSD. Subjective well-being before ICU admission, MOD and ICU memories were associated with PTSD and related psychopathologic symptomatology. These criteria could be used to identify survivors at risk of developing PTSD. Background Cost considerations may influence therapeutic reasoning and decisions in the intensive care unit (ICU). To date, only very few data illuminating the association of costs and consequences (i.e. outcomes) of critical care services are available. In this study, the long-term outcome, health-related quality of life (HRQL), and ICU and hospital costs of medical ICU patients were assessed. The mean costs per QALY were €3101. Increasing severity of illness (SAPS II quartiles) was associated with higher ECPS and costs per QALY (Table 1). Conclusion A large proportion of patients survived > 5 years and reported a good HRQL. Considering the severity of illness and the patients' outcome, the ECPS and costs per QALY were low compared with other therapeutic approaches (e.g. beta-blockade after myocardial infarction, €20,000 per added life year). Health-related quality of life (HR-QOL) is a relevant measure of intensive care unit (ICU) outcome. This study aims to evaluate HR-QOL in survivors of severe sepsis and septic shock (sepsis group) compared with survivors without sepsis (control group), using a specific questionnaire (QOL-SP) developed by Fernandez. Methods Six months after ICU discharge, survivors attended a follow-up consultation and the QOL-SP was applied. The QOL-SP comprises 15 items grouped into three subscales, which evaluate Basic Physiologic Activities (BPA), Normal Daily Activities (NDA) and Emotional State (ES), and enable the calculation of a global index (QOL-SP Index). Patients from the sepsis group were compared with those from the control group concerning background variables (age, gender, previous health state), ICU variables (reason for admission, APACHE II score and length of ICU stay) and QOL-SP variables.
Patients younger than 18 years and those with an ICU stay ≤ 1 day were excluded. Patients exhibiting nonsevere sepsis at or after ICU admission, and those with severe sepsis/septic shock after ICU admission, were also excluded. Between March 1997 and March 2001, a total of 1285 patients were admitted to the ICU, of whom 697 were included in the study. Of these, 305 (44%) were admitted for severe sepsis/septic shock. Mortality was 34% in the sepsis group and 26% in the control group. One hundred and four patients from the sepsis group and 133 patients from the control group completed the QOL-SP questionnaire. There were no differences in age or previous health state between the two groups. Patients from the sepsis group exhibited a significantly higher APACHE II score and a significantly longer ICU stay. Sepsis survivors reported significantly fewer problems in the BPA and NDA subscales; furthermore, although not statistically significant, they exhibited a better QOL-SP Index than the control group (Table 1). Conclusion At 6 months after ICU discharge, when evaluated with a specific critical care questionnaire, survivors of severe sepsis/septic shock exhibited a similar, if not better, HR-QOL than ICU survivors without sepsis. Candida albicans was isolated in 63%. A. baumannii was multiresistant (MULTI-R) to antibiotics in 17.65% of patients (11.65% of cultures), or of intermediate sensitivity only to gentamycin (GENT-I). In the D group, 36.36% had MULTI-R or GENT-I A. baumannii (23.73% of cultures): 24.24% MULTI-R (13.56% of cultures) and 2.86% GENT-I (4.55% of cultures). P. aeruginosa was MULTI-R in 42.31% of patients (44.12% of cultures): in 54.55% of the D group (55.56% of cultures) vs 33.33% of the R group patients (31.25% of cultures).
K. pneumoniae was sensitive only to imipenem or cotrimoxazole in 7.69% of the R group and only to quinolones in 11%.
Selenium and zinc plasmatic levels in intensive care patients. A Kazda, H Brodska, H Vinglerova.
P344 Five-year survival, quality of life, and individual costs of 303 consecutive medical intensive care patients. J Graf, J Wagner, C Graf, K Koch, P Hanrath, U Janssens; University Hospital Aachen, Germany, and Caritas Hospital.
References
Should we measure cerebral blood flow in head-injured patients?
Medical imaging at Guy's Hospital, King's College London.
Defining ischemic burden after traumatic brain injury using 15O PET imaging of cerebral physiology.
Voxel-based mapping of irreversible ischaemic damage with PET in acute stroke.
The Wessex Head Injury Matrix (WHIM) main scale: a preliminary report on a scale to assess and monitor patient recovery after severe head injury.
Practical assessment of brain dysfunction in severe head trauma.
Evaluation of coma and vegetative states.
The Western Neuro Sensory Stimulation Profile: a tool for assessing slow-to-recover head-injured patients.
Doors and People: A Test of Visual and Verbal Recall and Recognition. Suffolk: Thames Valley Test Company.
A simple objective technique for measuring flexibility in thinking.
LEXIS: Tests pour le diagnostic des troubles lexicaux chez le patient aphasique.
Echelle de Vocabulaire en Images Peabody: adaptation française du Peabody Picture Vocabulary Test.
Measurement of circulatory and respiratory failure using less invasive hemodynamic monitoring. In: Yearbook of Intensive Care and Emergency Medicine.
Trace elements. In: Nutrition in Critical Care.
Artificial Nutrition Support in Clinical Practice.
The National Transplant Co-ordination Centre information bulletin.
Outcome measures in adult critical care: a systematic review.
Maslach Burnout Inventory Manual.
Conclusions A significant percentage of ICU patients have trace mineral deficiencies, despite well-dosed parenteral (and/or enteral) feeding regimens. Low plasma levels are not unequivocal to interpret [2], but our results support a more prominent role for research and re-evaluation of the currently recommended nutrition standards for ICU patients. A postoperatively reduced plasma concentration of ascorbic acid (AA) (< 45.5 µmol/l) is common due to increased metabolic requirements. Positive effects of AA in the postoperative period are well known (e.g. radical scavenger activity); hence there is an indication for the substitution of AA. However, the dosage commonly recommended for substitution during clinical nutrition is not adequate. Therefore, a postoperative AA substitution procedure 'overnight' to normal plasma values was investigated in this randomized, prospective study in a postoperative ICU of a university hospital. Fifty-seven electively operated patients were assigned to a control group or an intervention group. In all patients the AA plasma concentration was analysed preoperatively and on the first three postoperative days. Patients in the intervention group received AA intravenously 'overnight', up to four times within 12 hours, depending on the initial AA concentration (< 34.1 µmol/l: 4 × 500 mg AA; < 56.8 µmol/l: 2 × 500 mg AA; < 68.2 µmol/l: 1 × 500 mg AA).
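The threshold-based regimen just described maps the measured plasma concentration to a number of 500 mg doses; a minimal sketch (thresholds in µmol/l, taken from the protocol above; illustrative only):

def aa_doses_overnight(aa_plasma_umol_l):
    """Number of 500 mg ascorbic acid IV doses over 12 hours,
    per the thresholds stated in the abstract."""
    if aa_plasma_umol_l < 34.1:
        return 4
    if aa_plasma_umol_l < 56.8:
        return 2
    if aa_plasma_umol_l < 68.2:
        return 1
    return 0

print(aa_doses_overnight(30.0), "x 500 mg")  # -> 4 x 500 mg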
The preoperative AA values and those on the first postoperative day did not differ between the two groups. However, the postoperative plasma concentration was lowered (< 45.5 µmol/l) in 82.4% of all patients. In the intervention group, the dosage regimen increased the AA plasma concentration to > 45.5 µmol/l overnight in 89.6%. In conclusion, the investigated substitution procedure is sufficient to increase the AA plasma concentration overnight to high-normal values in postoperative ICU patients. The aim of this pilot study was to assess the possible deficiency, with respect to trace minerals, of the standard nutrition protocol in the surgical ICU, based on the daily recommended doses (DRD) in the literature [1]. An intervention with extra supplements given in addition to the standard formula was evaluated. Methods A prospective, observational pilot study in a surgical ICU of a tertiary referral centre. Forty-eight intensive care patients with two or more organ failures were included in the study (APACHE II score 24.2 ± 7.9). All patients received total parenteral nutrition according to a standard protocol. Plasma measurements of manganese, selenium and zinc were performed before and after 7 days of extra supplementation with a commercial formula containing one DRD of each element (Addamel® N; Fresenius Kabi, 's-Hertogenbosch, The Netherlands). Twenty-five patients were also screened for copper and chromium levels and their response to supplements. The overall daily caloric intake in the week before inclusion in the study was 1693 ± 841 kcal, and during the study period it was 2211 ± 543 kcal (P = 0.002). Copper, a substantial element in the normal function of oxidative enzyme systems (the so-called 'cuproenzymes') that is primarily bound in plasma to ceruloplasmin, an acute-phase protein, is difficult to interpret in critically ill patients. However, levels at the start were normal (14.5 ± 6.3 µmol/l; normal range 10.0-30.0 µmol/l) and could be raised significantly (17.4 ± 4.6 µmol/l, P = 0.004). Ceruloplasmin levels were within the normal range and did not change significantly over the study period (0.34 ± 0.11 g/l, normal range 0.24-0.62 g/l, to 0.37 ± 0.10 g/l, P = 0.414). Manganese is part of the mitochondrial superoxide dismutase and important for the metabolic effects of vitamin K. In our population normal starting levels were found (30.5 ± 13.7 nmol/l; normal range 2-37 nmol/l) and they were raised significantly (37.0 ± 16.3 nmol/l, P = 0.021). Selenium, a co-factor in the erythrocyte glutathione peroxidase complex, has a protective role against peroxides. (Very) low baseline levels were found (0.53 ± 0.22 µmol/l; normal range 0.8-1.8 µmol/l), and the supplement, although double the DRD, could not normalize this (0.71 ± 0.28 µmol/l), but the improvement was statistically significant (P < 0.0001). Albumin, as the transport protein for selenium, was low and did not change significantly (20.9 ± 6.0 g/l, normal range 35-50 g/l, to 21.1 ± 6.5 g/l, P = 0.959). Zinc metabolism and physiology are subject to debate. Zinc deficiency, however, is known to cause impaired wound healing, alopecia and immunologic dysfunction. Almost all patients were deficient at the start (8.6 ± 3.6 µmol/l; normal range 11.5-23.5 µmol/l), and supplementation did result in significant improvement, but only just to normal levels (11.4 ± 2.6 µmol/l, P < 0.0001).
Chromium is a cofactor in insulin metabolism and glucose utilisation. Plasma levels are difficult to assess because of their uncertain biologic significance. However, starting levels were high (86 ± 52 nmol/l; normal range 9.6-50 nmol/l) and were raised nonsignificantly (90 ± 45 nmol/l). The chromium-transporting protein transferrin was low and did not change significantly (1.3 ± 0.9 g/l, normal range 2.0-3.5 g/l, to 1.3 ± 0.4 g/l, P = 0.475). The study was aimed at investigating plasmatic levels of zinc (Zn) and selenium (Se) in critical care patients, and the influence of standard supplementation of those elements on such levels. One hundred and eighty-one measurements were made in 55 patients. They were divided into two groups: the CARDIO group, 18 patients after cardiovascular surgery, well nourished, without supplementation of trace elements; and the ICU group, 37 patients in acute protein catabolism with average daily supplementation of 0.89 µmol (70 µg) Se and 107 µmol (7 mg) Zn. On days 0, 2, 3 and 7 of hospitalization the following were monitored: (1) plasmatic levels of Se and Zn, using atomic absorption spectrophotometry with electrothermic atomisation; and (2) prealbumin, albumin and CRP turbidimetrically, and orosomucoid nephelometrically. By means of the last four parameters the prognostic nutritional index (PINI) was calculated. Reference values: Se 0.58-1.82 µmol/l, Zn 10.9-18.4 µmol/l, PINI ≤ 1.0, CRP 0-10 mg/l. Zinc In the CARDIO group, significant elevations of CRP and PINI proved the acute metabolic response to trauma, and their changes corresponded to the dynamics of the Se and Zn levels. The decrease of Zn between the first and second investigations was significant (P < 0.001). The inverse dependence between Zn and CRP was also significant (P < 0.001, r = -0.387). For the ICU group, a decrease of Zn levels was characteristic, persisting despite Zn substitution. Neither the Zn:CRP relation nor the Zn:PINI relation was significant. Selenium Levels in both groups were significantly decreased. In the CARDIO group the inverse dependence between Se and CRP (P < 0.01, r = 0.306) was again significant. In the ICU group there was significant dependence for the Se:CRP relation (P < 0.001, r = -0.293) and Se:PINI (P < 0.001, r = -0.308).
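The PINI mentioned above is conventionally computed from these four proteins as the ratio of the two acute-phase reactants to the two visceral proteins; assuming the standard definition (the abstract does not restate it):

\[ \mathrm{PINI} = \frac{\mathrm{orosomucoid\ (mg/l)} \times \mathrm{CRP\ (mg/l)}}{\mathrm{albumin\ (g/l)} \times \mathrm{prealbumin\ (mg/l)}} \]

with values ≤ 1.0 regarded as normal, consistent with the reference values quoted above.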
In Group II, 34 patients (49%) were discharged in good neurological condition and 16 patients (23%) in poor neurological condition, and 19 patients (28%) died (P > 0.05). The mean durations of mechanical ventilation in Group I and Group II were 8.9 ± 7.6 days and 11.8 ± 10.1 days, respectively (P > 0.05). The mean ICU stay was 18.1 ± 17.9 days in Group I and 17.8 ± 13.9 days in Group II (P > 0.05).

Conclusion Although the prognosis of the patients in Group II seemed better than that of the conventional therapy group, the difference was not statistically significant.

Although rigorous control of fever is the current standard of care for the brain-injured patient, the patient management strategies currently available are often suboptimal and may be contraindicated.

Objectives We investigated a very low dose of subcutaneous diclofenac sodium (DCF SC) to treat fever in the NICU.

Methods DCF SC (0.17 mg/kg [≈ 1/6 fl]) was administered to febrile patients, and its effects on temperature (T°), intracranial pressure (ICP), cerebral perfusion pressure (CPP), mean arterial pressure (MAP), heart rate (HR) and diuresis were recorded continuously for 6 hours. Adverse effects of DCF (allergic, gastrointestinal and central nervous system bleeding) were monitored.

Until recently, influenza-associated acute encephalopathy (IAE) was rare in Europe and the United States, although investigators from these regions have now begun to report it. This severe complication of influenza is more common in Japan and is often fatal: the mortality rate is 30% and the rate of severe neurological sequelae is 30%. Since 2002, we have included intracranial pressure (ICP) monitoring in the management of infants with IAE in our center. This report summarizes our experience with ICP monitoring in IAE.

Methods We describe four infants aged 1-5 years (two boys and two girls) who were pyrexial and comatose or who had convulsions. Three were healthy prior to admission but one had a craniopharyngioma. Antibodies to influenza A virus were detected in all four patients. ICP monitoring was started as soon as possible after admission, and the patients' cerebral perfusion pressure (CPP) was maintained above 50 mmHg.

Results In three cases, the ICP was maintained below 25 mmHg, controlled with antipyretic therapy and pentobarbital infusion. In each of these cases, a computerised tomography (CT) scan of the brain showed bilateral thalamic hemorrhage and slight edema on the day of admission. In the fourth patient, the ICP suddenly increased to more than 50 mmHg, and a vasoconstrictor infusion (noradrenaline 0.5 µg/kg/min) was needed to maintain the CPP on the second hospital day; brain CT and magnetic resonance imaging (MRI) demonstrated severe cortical brain edema, similar to that described in the hemorrhagic shock and encephalopathy syndrome (HSES) in Europe. All the patients survived; two made remarkable recoveries, including the patient who needed the vasoconstrictor infusion.

Conclusions Before instituting ICP monitoring, it was difficult to determine the most appropriate therapy for severe IAE, and survivors usually had significant neurological sequelae. (1) The control of CPP in patients with IAE is just as important as in the management of patients with severe head injuries. (2) Based on the ICP and the CT and MRI scans, IAE can be divided into two groups. (3) One type of IAE seen in Japan may be related to the HSES described in Europe.
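The management rule in the IAE series above reduces to the standard relation CPP = MAP - ICP, with the thresholds reported in the abstract (CPP above 50 mmHg, ICP below 25 mmHg). As a minimal illustrative sketch, not the monitoring software used in the study, the following Python snippet encodes that check; the monitor readings are hypothetical.

```python
def cerebral_perfusion_pressure(map_mmhg: float, icp_mmhg: float) -> float:
    """CPP is mean arterial pressure minus intracranial pressure."""
    return map_mmhg - icp_mmhg

def check_targets(map_mmhg, icp_mmhg, cpp_min=50.0, icp_max=25.0):
    """Flag readings that violate the CPP/ICP targets used in the IAE series."""
    cpp = cerebral_perfusion_pressure(map_mmhg, icp_mmhg)
    alerts = []
    if cpp < cpp_min:
        alerts.append(f"CPP {cpp:.0f} mmHg below target {cpp_min:.0f} mmHg")
    if icp_mmhg > icp_max:
        alerts.append(f"ICP {icp_mmhg:.0f} mmHg above target {icp_max:.0f} mmHg")
    return cpp, alerts

# Hypothetical readings for illustration only.
for map_val, icp_val in [(80, 15), (70, 30)]:
    cpp, alerts = check_targets(map_val, icp_val)
    print(f"MAP {map_val}, ICP {icp_val} -> CPP {cpp}:",
          "; ".join(alerts) or "targets met")
```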
Background Severe head trauma patients are often deeply sedated, and this can cause myocardial depression with hypotension and low cardiac output. Hypotension is particularly dangerous in patients with cerebral oedema and high intracranial pressure. Dobutamine and/or norepinephrine are usually given to restore myocardial performance and increase blood pressure and, consequently, cerebral perfusion pressure (CPP). The effects of these two drugs on cerebral blood flow (CBF) are unknown. The aim of this study was to measure CBF and central hemodynamics in a group of patients with severe head trauma (Glasgow Coma Scale < 8), first using dobutamine and then norepinephrine.

Methods Five patients were studied. The following parameters were monitored: intracranial pressure, jugular oxygen saturation (SjO2), mean arterial pressure, central hemodynamic parameters (cardiac index, central venous pressure, systemic vascular resistance, intrathoracic blood volume, extravascular lung water) and CBF. The CBF was measured with the TCDI technique (COLD system; Germany) [1]. Patients had two thermistor-tipped fiberoptic catheters: the first inserted in the femoral artery and the second in the bulb of the right jugular vein. First, patients received a dobutamine infusion (t1) to reach CPP > 70 mmHg (mean dosage 6.7 µg/kg/min); then, after central hemodynamic and CBF measurements, dobutamine was replaced with norepinephrine (0.12 µg/kg/min) (t2), aiming for the same CPP achieved with dobutamine. Finally, measurements were performed during simultaneous infusion of dobutamine and norepinephrine (dobutamine 4.2 µg/kg/min and norepinephrine 0.05 µg/kg/min) (t3), again achieving the same CPP as at t1.

Results The CBF increased significantly with norepinephrine (median 47.7 ml/100 g vs 72 ml/100 g; P < 0.05), but not with the dobutamine-norepinephrine combination (median 52.5 ml/100 g). No significant changes were observed in central hemodynamic parameters.

Conclusions These preliminary data suggest that dobutamine decreases CBF compared with norepinephrine, with no changes in central hemodynamic parameters or CPP. The pathophysiology of these results remains unknown.

Introduction In Poland, the number of organ donors per million inhabitants in 2002 was 12.68, 8.9% higher than in 2001 (11.64/million); still, many potential organ donors are missed. The objective of this study was to identify the major causes of missed donors in a university hospital (600 beds) with a busy neurosurgery ward. The Barlicki University Hospital, with an eight-bed ICU, is located in the second biggest city in Poland, with a population of one million. Its transplant centre, the second in the city, started in 2000.

Methods A retrospective audit of ICU documentation from 2000 to 2003 of patients with declared brain stem death was conducted. After collecting the information, a computerised Excel database was created. The mean age of potential organ donors was 44.

Conclusions Changes in the Transplant Law allowing organ donation in cases of cardiovascular failure or cardiac arrest may further increase the number of organ donors. A discussion of whether the relatives should be asked for consent is required. There are still many small hospitals that do not cooperate with transplant centres; involving these hospitals in an organ donation program is the major task for the coming years.
According to Reis-Miranda, one nurse can handle 46 TISS-28 points, and a TISS-28 < 20 has been suggested to qualify as medium care. In order to rationalize the use of intensive care resources in our 56-bed surgical ICU, which has a 1:2 nurse-to-patient ratio, we developed a tool for determining the required number of medium care beds. A comparison with an 11-bed medium care unit (MCU), which has a 1:3 ratio, was made.

Methods A prospective observational study was designed in which the attending nurses scored TISS on a daily basis. In addition, ICU nurses and the attending physicians gave a daily opinion on whether or not the patient could have been cared for in a MCU; physicians were blinded to TISS scores. The label 'medium care' was given when both nurses and physicians agreed on this issue. Daily TISS scores in the ICU were related to the subjective opinion of nurses and physicians. The ethical committee approved the study protocol. All patients admitted to the ICU and to the MCU during a 4-month period were included, except those with a burn injury, providing 6077 patient-days in the ICU and 875 patient-days in the MCU for analysis.

Results In the ICU the median (P25-P75) TISS was 34 (28-40), vs 27 (22-32) in the MCU (P < 0.0001). In the ICU, but not in the MCU, the median TISS was significantly higher on weekdays than at weekends (35 [29-41] vs 32 [28-38], respectively; P < 0.0001). Taking actual nurse staffing into account, ICU nurses on average handled 70 TISS points and MCU nurses 81 TISS points. In the ICU, 11% of patient-days were considered medium care by both physicians and nurses, which translates into up to five medium care beds within the 56 ICU beds. Patient-days labeled medium care scored 26 (23-29) on TISS, as compared with 35 (30-41) for intensive care (P < 0.0001). A receiver-operating characteristic curve revealed an area under the curve of 84.3% and identified a TISS of 29 as the best cutoff for medium care. Using this definition, 25% of all ICU and 57% of all MCU patient-days would be labeled medium care; however, this cutoff yields 19% false-positives. Using a cutoff of 20 TISS points, only 3% of ICU patient-days and 15% of MCU patient-days would qualify.

Conclusion Our study provided a nurse-physician consensus-based definition of medium care, which equalled a median TISS of 26, a value approaching that of a MCU in our hospital setting. Using such a quantitative definition of medium care allows more rational allocation of nursing staff.

Introduction Mortality in the UK is known to be higher in winter than in nonwinter, but the comparative importance of variation in case mix and increased pressure on hospitals is not clear. We explored this issue using data from the national audit of critical care admissions, the ICNARC Case Mix Programme.

Methods Using data from 113,389 admissions to 115 adult, general critical care units in England, Wales and Northern Ireland from 1995 to 2000, we investigated whether hospital mortality following admission to critical care was higher in winter (defined as the first working day in January through 31 January) than in nonwinter (March-November inclusive), and explored the causes of any observed differences in terms of the case mix of admissions and the workload of the units.

Results Crude hospital mortality was higher in winter than in nonwinter (odds ratio [OR] 1.18, 95% confidence interval [CI] 1.11-1.25); a sketch of how such a crude OR and its CI are computed follows below.
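As a minimal illustrative sketch, not the ICNARC analysis itself, a crude odds ratio and its 95% confidence interval can be derived from a 2 x 2 table of deaths by season using the standard log-odds (Woolf) method. The cell counts below are hypothetical, chosen only to demonstrate the arithmetic.

```python
import math

def odds_ratio_ci(deaths_a, survivors_a, deaths_b, survivors_b, z=1.96):
    """Crude odds ratio of group A vs group B with a Woolf (log-odds) 95% CI."""
    or_ = (deaths_a * survivors_b) / (survivors_a * deaths_b)
    se_log = math.sqrt(1 / deaths_a + 1 / survivors_a
                       + 1 / deaths_b + 1 / survivors_b)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: (deaths, survivors) in winter vs nonwinter admissions.
or_, lo, hi = odds_ratio_ci(3500, 8500, 25000, 71500)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```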
After adjusting for case mix using the APACHE II mortality probability, this effect was reduced but still significant (OR 1.11, 95% CI 1.04-1.18). When additional factors reflecting case mix and workload were introduced into the model, the overall effect of winter was no longer significant (P = 0.08). Factors reflecting both the case mix of the individual patient and that of the patients in surrounding beds were significantly associated with outcome. After adjustment for other factors, the occupancy of the unit (proportion of beds occupied) was not significantly associated with mortality.

Conclusion The excess winter mortality in UK critical care units can be explained by the variation in the case mix of admissions. Unit occupancy was not associated with mortality.

Aim The aim of this study was to assess the association of PIRO-related variables with the occurrence of bacteremia in patients admitted to the intensive care unit (ICU) with infection.

Methods This retrospective cohort study included 191 patients admitted to three ICUs of a tertiary care medical center with an infection-related diagnosis between 1 July and 31 December 2002. Bacteremia was defined as the isolation of a bacterium in one or more blood cultures. The P-related variables included evidence of immunosuppression, dialysis-dependent renal failure, nursing home residence, and performance status. The I-related variables included site of infection and place of acquisition. The R-related variables included heart rate, respiratory rate and temperature. Sequential Organ Failure Assessment (SOFA) scores were calculated to quantify organ dysfunction. We used Student's t test, the Mann-Whitney U test, the chi-square test and Fisher's exact test for comparisons between the bacteremia and nonbacteremia groups. To define which variables were significantly and independently associated with bacteremia, we performed logistic regression analysis including day 1 SOFA score, temperature > 38°C, heart rate, immunocompromised status and site of infection (urinary 1, other 0). P < 0.05 was considered significant.

Results The mean (SD) age of the patients was 64 (16) years and 92% were Caucasian. Bacteremia was documented in 44 (23%) patients. In univariate analysis, only the SOFA score was associated with the occurrence of bacteremia: the mean (SD) SOFA score in the bacteremia group was 7.3 (4) vs 5.6 (3) in the nonbacteremia group (P = 0.012). The day 1 SOFA score was also an independent predictor of bacteremia (P = 0.0057), along with day 1 temperature > 38°C (P = 0.016).

Conclusion The SOFA score, used as a measure of organ dysfunction in the PIRO concept, may have a role in the prediction of bacteremia in patients admitted to the intensive care unit with infection.
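As a minimal illustrative sketch of the kind of model described, not the authors' actual analysis or data, a logistic regression on the five named predictors could be fitted as follows. The synthetic dataset, variable names and effect sizes are assumptions for demonstration only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 191  # cohort size from the abstract; the patient data here are synthetic

# Synthetic predictors mimicking the model's covariates.
sofa = rng.normal(6.0, 3.5, n).clip(0, 24)   # day 1 SOFA score
fever = rng.integers(0, 2, n)                # temperature > 38 C (0/1)
heart_rate = rng.normal(95, 20, n)
immunocomp = rng.integers(0, 2, n)           # immunocompromised status (0/1)
urinary = rng.integers(0, 2, n)              # urinary site of infection (0/1)

# Synthetic outcome: bacteremia made more likely by higher SOFA and fever,
# loosely echoing the abstract's independent predictors.
logit = -3.0 + 0.25 * sofa + 0.8 * fever
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([sofa, fever, heart_rate,
                                     immunocomp, urinary]))
model = sm.Logit(y, X).fit(disp=False)

names = ["intercept", "sofa", "fever", "heart_rate", "immunocomp", "urinary"]
for name, coef, p in zip(names, model.params, model.pvalues):
    print(f"{name:>11}: OR {np.exp(coef):.2f}, P = {p:.4f}")
```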