authors: Nicoll, A.
title: Pandemic risk prevention in European countries: role of the ECDC in preparing for pandemics: Development and experience with a national self-assessment procedure, 2005–2008
date: 2010-12-01
journal: Bundesgesundheitsblatt Gesundheitsforschung Gesundheitsschutz
DOI: 10.1007/s00103-010-1163-3

To be effective, risk prevention work has to take place well before pandemics through the three Ps: Planning, Preparedness and Practise. Between 2005 and 2008 the European Centre for Disease Prevention and Control (ECDC) worked with the European Commission (EC) and the WHO Regional Office for Europe (WHO-Euro) to assist European countries in preparing themselves for a future influenza pandemic. All eligible countries in the European Union and European Economic Area participated with energy and commitment. Indicators of preparedness were developed based on WHO planning guidance, and these were set within a simple assessment procedure which included a formal country visit. The procedure evolved considerably with field experience. As the complexity of pandemic preparedness was appreciated, it changed from a classical short external assessment to longer national self-assessments with demonstrable impact, especially when the self-assessments were published. Essential supporting activities were undertaken, including a series of pan-European pandemic preparedness workshops organised by the EC, WHO-Euro, the ECDC and countries holding the European Union Presidency. The self-assessments highlighted additional work and documentation that national authorities needed from the ECDC. This work was undertaken and the documents produced. The benefits of the self-assessments were seen in the 2009 pandemic, in that EU/EEA countries performed better than some others. A number of the guidance documents were updated to fit the specific features of the pandemic. However, the pandemic revealed many weaknesses and brought new challenges for European countries, notably over communication and vaccines, the need to prepare for a variety of scenarios and to factor severity estimates into preparedness, to improve surveillance for severe disease and to deliver seroepidemiology. Any revised self-assessment procedure will need to respond to these challenges.

To be effective, pandemic risk prevention work has to take place well before pandemics through the three Ps: Planning, Preparedness and Practise [1]. Global influenza surveillance has been led by WHO since the early 1950s and, although pandemics occurred at irregular intervals in the 20th century and before (Fig. 1), formal national and international pandemic planning only started late in the 20th century. The first published national pandemic plans appeared in Europe in the 1990s, some stimulated by the emergence of a potential pandemic influenza A(H5N1) in Hong Kong in 1997 [2]. The first WHO pandemic planning guidance appeared in 1999 but was based on limited international consultation [3]. More considered work began in early 2002 with the formal development of a global influenza agenda including pandemic planning [4, 5], leading to a resolution at the World Health Assembly in 2003. This also contained the first targets for seasonal influenza immunisation [6].
The experience of SARS in 2003, another albeit considerably different acute respiratory viral infection, gave stimulus to this work, as did the development of the first comprehensive International Health Regulations (IHR 2005), which could be used when declaring a pandemic [7]. The first proper global guidance on pandemic planning was adopted by WHO in 2005, along with a seminal checklist [8, 9]. In Europe the precursors to the European Influenza Surveillance Network (EISN) had been active since the late 1980s. In 2001 the European Commission convened the first European Union conference on pandemic preparedness (Tab. 1). Particular stimulation for strengthening pandemic plans and preparedness came from the re-emergence of influenza type A(H5N1) in China and South East Asia in 2003, which then went on to affect birds and some humans in the rest of Asia and Europe in 2004-2005. When the European Commission (EC) and the WHO Regional Office for Europe (WHO-Euro) convened the first European Pandemic Preparedness Workshop in March 2005, the European Member States as a whole became more serious about pandemic planning. This included a review of EU/EEA Member States' paper pandemic plans, and later that year the EC issued a Communication on Pandemic Planning for the European Union countries [10].

The European Centre for Disease Prevention and Control (ECDC) opened in May 2005 in the midst of this accelerated interest. Its Director made pandemic preparedness its first disease-specific priority, alongside establishing the basic infrastructure. The ECDC did this by drawing on the expertise of its four Technical Units (Health Communications; Preparedness and Response, which led the pandemic preparedness activities; Scientific Advice, which hosts the coordination of the influenza work; and Surveillance). Working in support of Member States and the European Commission, the ECDC initially developed a simple assessment procedure to strengthen pandemic preparedness in the EU and European Economic Area (EU/EEA) countries and started a series of visits (Tab. 1). It did so in collaboration with the European Commission (EC) and the WHO Regional Office for Europe (WHO-Euro). At the same time the ECDC developed close linkages with WHO Headquarters, since that has the main influenza expertise globally and leads the work of pandemic preparedness and response.

The objective of this paper is to describe the ECDC assessment work and procedure, and how and why it evolved with experience with the EU/EEA Member States. The paper also reports the resources that were developed for Europe from needs expressed in the assessments. It shows how momentum was maintained and European capacity drawn upon. The overall results of the assessments are described and, finally, the paper lists the important gaps identified as a result of the real test presented by the 2009 pandemic.

The central component of the ECDC's work on pandemic preparedness was a standard procedure for assisting EU and EEA countries to assess and improve their national and local pandemic preparedness. This was based on WHO's 2005 guidance, and especially its checklist, but had to operate within the limited mandate of EU bodies in general, and the ECDC in particular, in relation to human health issues [11]. The ECDC has no regulatory function and can only issue guidance (that is, documents on a topic offering options with their pros and cons, operational aspects and the relevant scientific evidence). It can rarely offer recommendations or directions.
Also, it can only become involved in a country's preparedness when invited. However, with strong leadership from the European Health Commissioner (Mr Kyprianou, 2005-2008), the European Commission and successive EU Presidency countries (notably the UK, Austria, France and Sweden), all EU and EEA countries welcomed and supported the ECDC's approach, to the extent of seeking an ECDC assessment.

The initial assessment procedure was simply a short country visit using a set of indicators. This was developed by ECDC staff in its Preparedness and Response Unit in 2005, using the WHO planning guidance and especially its 2005 checklist as reference points [8, 9]. Following piloting with the Swedish authorities, the visits began in the summer of 2005 (Tab. 1). The initial procedure was a classical external assessment relying on a visiting team (usually, but not always, led by a senior member of ECDC or WHO-Euro staff, with some members from the EC and WHO-Euro). The team visited countries for 3-4 days, checking plans against the WHO templates and making a limited number of visits to a few convenient national institutions. A standard questionnaire, based on the WHO checklist, was completed during the visit, and those questions became the first ECDC Indicators of Pandemic Preparedness. This followed the broad structure recommended by the WHO documents, with the five WHO guidance categories: planning and coordination, situation monitoring and assessment, prevention and reduction of transmission, health system response, and communication. Following the visit the ECDC would send a written report to the country, which remained unpublished.

The autumn of 2005 and the winter of 2005/2006 provided a stimulus for EU/EEA countries to prepare for a severe pandemic, with human cases appearing in a neighbouring country (Turkey); birds in many EU countries were also then infected. Strenuous work to improve preparedness was led by WHO (human health), the European Commission and the animal health agencies (FAO and OIE) [12, 13]. Many countries secured stockpiles of antiviral agents, principally oseltamivir [14]. In response, items on preparedness for avian influenza outbreaks were added to the ECDC indicators, and joint work was sought with Ministries of Health and Agriculture during the visits. However, "bird flu" was almost a distraction for the ECDC in that it had to develop a suite of risk assessment and guidance documents for what was an animal influenza that could occasionally infect humans [15, 16]. While human infections with A(H5N1) often had lethal consequences, to date no such infections have occurred in the EU/EEA countries, and the virus has as yet failed to fully adapt to humans and transmit efficiently from person to person [16]. With hindsight, this severe threat led most Member States, the ECDC and WHO to focus on planning for a much more severe pandemic than transpired in 2009. Preparing for a severe threat was a very defensible first step, but it should of course have moved on to planning for mild as well as severe eventualities [17, 18].

As the country visits proceeded, a number of limitations in the 2005 procedure became clear, and it rapidly evolved with significant revisions in 2006 and 2007 (Tab. 1). In order to provide detailed guidance for the visits, the ECDC (in collaboration with the European Commission and WHO-Euro) led the further development of an assessment tool that addressed the planning process, discussion points for the meetings, and report writing.
An internal guide was developed for team leaders. The tool evolved substantially during the first year of application and latterly became available in a version that focused more on local preparedness, intersectoral work beyond the health sector, consistency of preparedness with neighbouring countries (interoperability), measures around seasonal influenza, especially influenza vaccination, laboratory preparedness, antiviral strategies, exercises, and communication aspects [19, 20, 21]. The most important developments (Tab. 2) and some of the resources will now be described in more detail.

The most important development was to move from the classical short external assessment visit to a more demanding self-assessment owned and enacted by the country. The assessment then became a collaborative effort with an in-country lead and all the national agencies involved in influenza preparedness as essential players. It was also appreciated that the scope of pandemic preparedness was becoming more and more complex. While the visits remained central and still lasted only 4 days, they were part of a longer procedure lasting 4 or more months, orchestrated by the country's lead agency for pandemic preparedness, usually, but not always, the Ministry of Health (Tab. 2, [19]). Standard aims and ground rules were agreed (Tab. 3).
The central visit always involved work with the Ministry of Health and the national public health agency, but also, variously, Ministries of Agriculture, institutions involved in general civil emergency preparedness, the national influenza committee, the national influenza reference laboratory and the national surveillance centre. The visits also increasingly included some local representatives as the emphasis shifted towards local preparations, including a visit to at least one set of local preparedness structures. The visits resulted in a report with a list of recommendations and actions prepared by the ECDC and the national authority. Finally, in the spirit of openness and to make it more likely that recommendations would be acted upon, countries were latterly encouraged to publish their reports, and five of the countries undertaking self-assessments did so (Finland, Ireland, the Netherlands, Norway and Sweden).

The list of key and subsidiary indicators also evolved to a point where they could be used by the countries and the EU to monitor their planning progress. A special list of indicators was developed for communication preparedness. A working group had been set up by WHO-Euro, under a project funded by the European Commission, to develop similar indicators and make them more SMART, i.e. specific, measurable, achievable, realistic and timely. The ECDC was invited to join this group to ensure there were not competing sets of indicators. A proposed set of primary and linked secondary indicators was field tested during two of the assessment visits. They were found to be unacceptably complex for the in-country teams, so that their completion was not feasible.

Tab. 2 Most important developments of the assessment procedure:
- The procedure became one of enhanced self-assessment rather than an external assessment
- The assessment tool is filled in beforehand by the key contact in the country (national team leader), often by breaking it up and distributing it to other in-country authorities; hence time is used efficiently during the assessment visit itself
- Countries identify ahead of time the issues they especially wish to focus on (reflecting the growing complexity of pandemic preparedness), and external teams are selected to reflect these needs
- Increasing emphasis is put on local preparations and the operationalising of national plans at the regional and local level, especially in the health services
- Much more attention is paid to work beyond the health sector, with intersectoral work (i.e. with the non-health sectors, education, security etc.) and with neighbouring countries (interoperability)
- There is more emphasis on the measures around seasonal influenza and especially influenza vaccination
- There is a more detailed section on laboratory preparedness at national and local levels
- The communication aspects have been further developed
- Preparations for an outbreak of highly pathogenic avian influenza are emphasised in the light of the outbreaks in animals experienced in the EU
- The key indicators have been adjusted to conform to the above developments and there are more subsidiary indicators indicating work that builds around these; the report places more emphasis on the development of future work plans
- Recommendations became more like SMART objectives allowing later audit (though this also slowed down the final acceptance of a report because of the resource implications)
- Latterly the reports were published by the countries

(Extract from Tab. 1: October, completion by the ECDC of the last national pandemic preparedness self-assessments, by then done for all 30 EU/EEA countries.)
However, it was also appreciated that plans alone could not reflect preparedness, and there was a danger that national authorities would stop at producing a well-written pandemic plan without developing the operational aspects underneath. In November 2005 the European Commission carried out a pandemic exercise, Common Ground, involving every EU and EEA government, all relevant EU agencies, WHO and the substantial European pharmaceutical industry (which accounts for a significant proportion of global influenza vaccine production). The exercise was highly successful in that many countries combined the EU-level event with national exercises [26]. These exposed many gaps and weaknesses at both the EU and national levels. This experience, and further national exercises stimulated by Common Ground, led to an appreciation that the assessments had to include Practise, the third P joining Planning and Preparedness. It became standard in the assessments to request details of exercises that had been undertaken or to recommend that they be carried out [19, 21].

An inherent weakness of pandemic preparedness is that, though it has to start centrally in countries, for the individual citizen the countermeasures have to be available and delivered locally [1]. If that is not the case, the preparations can be viewed as a waste of time by the citizens.

Tab. 3 Standard aims and ground rules agreed for the national self-assessments
Aims
1. To self-assess their influenza preparedness using a common European framework, with the main focus on pandemic preparedness, but addressing also protection against human seasonal influenza and preparedness against transmission of highly pathogenic avian influenza to humans
2. To determine baselines of influenza preparedness and response, or to determine progress made since any earlier assessments
3. To identify strengths of current influenza preparedness, but especially to focus on areas where additional strengthening is needed and so make recommendations for future work
4. To identify areas where support from the ECDC or other partner agencies should be requested
Ground rules
1. There is transparency between the ECDC and the national authorities, operating as a joint assessment team led by the ECDC. Essentially it is a structured self-assessment by the country of the country, but facilitated and led by an external team (for EU and EEA countries the lead is by the ECDC)
2. The assessment report will not be used to make comparisons between countries except where it is agreed that comparisons can be made, e.g. vaccine coverage
3. Examples of good practices and innovations are shared with other countries, with the knowledge of the country concerned, to encourage making them public through the ECDC webpage
4. Reports and the results of the assessment process are not shared by the ECDC with third parties without the written consent of the national authorities
5. External team members who are not members of ECDC staff work to the ECDC team leader and do not share the findings with other members of their own organisation without the team leader's permission and the knowledge of the country authorities
6. Any contact with the media is by the country concerned

After the second European workshop it was appreciated that, in addition to evaluating written plans, the self-assessments needed to determine the extent to which these had been translated into preparedness at different levels in the countries, for example whether or not complementary standard operating procedures and contingency arrangements had been developed, staff educated and trained, equipment and supplies ordered, and local business continuity planning undertaken. These facets were added to the procedure and indicators [20, 21]. Particular examples are the delivery of antivirals and vaccines. While a country may have acquired a national stockpile of oseltamivir and have an Advance Purchase Agreement for specific pandemic vaccines, these will be to no effect if plans are not in place, and practised, for getting the drugs and vaccines to local practitioners and then on to the public. To assist in this, the ECDC developed what it called acid tests for countries to apply to themselves, to convince themselves that they could deliver countermeasures to citizens (Tab. 5, [27]).

Tab. 5 Some suggested acid tests for helping to assess and strengthen local preparedness for moderate or severe pandemics
The idea of these acid tests is that those responsible for local services can use them to assess whether they can deliver what is expected of them in a crisis. They should be applied alongside planning assumptions of 20-30% of staff being off sick for short periods (2-3 weeks) just when the number of people seeking or requiring care increases considerably.
1. Can local services robustly and effectively deliver antivirals to most of those who need them within the time limit of 48 hours from the start of symptoms?
2. Are there simple mechanisms for rapidly altering the indications for giving antivirals?
3. Are there mechanisms for ensuring that adequate supplies of antibiotics and other essential medical supplies (infection control materials, injection devices etc.) are available, or coming through, for a sustained period of increased need?
4. Have local primary and secondary care services identified which non-influenza core services they will sustain and which they will stop in the peak period?
5. Can local hospitals increase ventilatory support (intensive care) for influenza patients, including attending to issues such as staff training, equipment and supplies?
6. How would local funeral services deal with sustained increased demand over a prolonged period whilst still meeting reasonable family expectations, including those of local faith groups?
7. Has business continuity planning been completed such that essential non-influenza-related core health services have been identified and could be delivered with significant numbers of personnel being unavailable for work? Specifically: a) social care for vulnerable groups; b) supermarket supply and delivery at check-outs; c) fuel supply
8. Has it been agreed how local clinical, laboratory, public health and social care staff will be paid for the increased working (overtime) that will take place over a pandemic, and the basis of this work? Is it as volunteers, under contract, etc.?
9. If the intention is to close schools proactively or reactively to reduce transmission, how will children be cared for so that they do not simply mix outside school?
10. Again, if the intention is to close schools, have parents been informed and asked what alternative arrangements they will make?
11. Once a pandemic vaccine (a vaccine that works against the new virus; it will not be ready for at least 4-6 months after the start of a pandemic) is available, are there agreements in place for determining who should receive the vaccine first?
12. Again, when the pandemic vaccine becomes available, are there arrangements made for its equitable and efficient delivery?
This list is not intended to be complete, and the ECDC welcomes both comments on the current tests and suggestions for new tests, such as lessons learnt from exercises. Comments should be sent to influenza@ecdc.eu.int

Initially, European pandemic plans focused exclusively on the health care sector, and the first ECDC procedure and indicators reflected this. Partially this followed the WHO lead, which primarily concerned the health sector and did not contain a multi-sectoral component until 2009, following the creation of a United Nations System Influenza Coordinator (UNSIC) [28]. The importance of non-health sectors was appreciated for two reasons. Firstly, a pandemic could temporarily remove up to 20% of working adults through illness or having to care for others and so threaten the functioning of core activities well beyond the health sector; pandemic business continuity planning was needed, for example for the power industry and food and fuel supplies. Secondly, while there was agreement that certain public health measures might constitute significant countermeasures to reduce peak illness prevalence, those in the social distancing category required cross-sectoral preparation and action [29]. For example, proactive school closures needed close coordination between education, health and other sectors at national and local levels [30]. Therefore, in their updating, the ECDC procedures increasingly required health sector counterparts to involve other sectors, business and civil society. At the same time some EU countries started to publish cross-government or whole-of-society plans, and eventually the French Presidency Angers Workshop in 2008 led to formal EU Health Council Recommendations on the need for multi-sectoral planning in pandemic preparedness [20, 21].

After the initial European pandemic preparedness workshops in Luxembourg and Copenhagen, two other workshops were held, in May 2006 in Uppsala (Tab. 1) and in September 2007 in Luxembourg. Alternating with the European-level workshops, and following a recommendation of the Uppsala workshop, the ECDC started in 2006 to organise smaller, regional-level and topic-based workshops. These workshops, of up to 10 countries with, where possible, common borders, addressed specific operational issues within and among Member States. An additional topic-specific workshop on communications was also held in 2006, leading to the formation of a network of communicators under the Health Security Committee. As capacity developed in the EU, the ECDC and WHO-Euro (which was undertaking its own assessments of countries outside the EU/EEA area) increasingly drew on those countries that had undertaken early assessments to contribute to later assessments. The fourth and final European workshop, in 2007 in Luxembourg, made a special feature of all the innovations from Member States, and the ECDC started publishing these on its website [31]. The fact that the indicators were standard meant that it was possible to make comparisons of a country against a norm.
At the request of the Health Commissioner, two cross-sectional surveys were undertaken by the ECDC with EU/EEA Member States in 2006 and 2007, asking them to indicate their performance on the indicators. A sensitive question has been whether performance on preparedness indicators should be centrally monitored in the EU. A number of Member States made it clear to the ECDC that the country-specific results should only be known to the country, that they should not pass beyond the ECDC and that specifically there would be no "league tables". This followed an earlier academic exercise in which preparedness plans found on the web were analysed and countries were placed in a "league table" without validation or informing the countries [24, 25]. Instead, in 2006 the ECDC used the then indicators to gather country-specific data but did not publish individual country results. Rather, normative data were passed back to countries so that they could determine how they compared with other EU/EEA countries. All countries participated in a repeat of the survey using a core of the same indicators in 2007, which allowed the ECDC to determine to what extent countries' preparedness had improved after 12 months of work [32]. Member States were also shown the report for comment before publication, and the ECDC made adjustments when suggestions and requests were received. Hence the ECDC was seen as an "honest broker". The combined results revealed that much progress was being made, while at the same time there were a number of weaknesses, especially in the fields of delivering preparedness locally and multi-sectoral work. This led to the development of a standard "three-dimensional" model of pandemic preparedness [22].

In preparation for an intended repeat survey in 2009 (Member States had asked for a moratorium on surveys in 2008), the EU Health Security Committee (the EU body with oversight of pandemic preparedness) pointed out that if Member States were going to be judged by the indicators they ought to approve them. A process of consultation was undertaken with the HSC Influenza Section members, leading to formal publication of the indicators, though not until after the 2009 pandemic [22].

The 2009 pandemic provided a real test of pandemic preparedness. While there was generally a strong response to a pandemic that itself was almost optimal for Europe [18, 32], it is also agreed that many weaknesses were revealed and lessons needed to be learnt [17, 33], perhaps the most important being to prepare for different kinds of pandemics, especially around that elusive parameter "severity" [34]. The last European status report on pandemic preparedness in 2007 [30] had concluded that, in spite of the progress found, a further 2-3 years of sustained effort and investment were needed by the EU and its Member States to achieve the level of preparedness needed to respond well to a pandemic. That can now be seen to have been an optimistic estimate. Dealing well with the challenges of any pandemic requires complex adjustments that may be very different from the experiences in other crises. Most problems occurred in stress on hospital intensive care and paediatric services, around risk communication, vaccination, maintaining the confidence of the professionals and delivering local interventions. These cannot be measured with indicators alone, and arguably the ECDC "acid tests" are the most useful of the indicators on hand ([27], Tab. 5).
Consensus on the surveillance difficulties was reached at a meeting of the European Influenza Surveillance Network (EISN) in 2010 [34]. These included difficulties in establishing surveillance in hospitals, undertaking seroepidemiology, estimating severity and sharing analyses. Finally, it is important to appreciate that the 2009 pandemic was unusually benign (it had a low case fatality rate and did not stress essential services outside the health sector) [17, 18, 33], i.e. countries should not neglect preparations for more severe pandemics.

The commitment to visit all 30 EU/EEA Member States in 2 years placed a considerable strain on the ECDC and its partner organisations. They will not have the capacity to re-visit the countries frequently, and follow-ups have so far been limited to addressing progress in pandemic preparedness during visits made for other reasons. Therefore, it remains challenging for the ECDC and partner organisations to assist the Member States over the longer term in keeping up the momentum in strengthening preparedness, though there is a Council Conclusion to do so, which the ECDC will support [35]. However, countries will need to do more self-monitoring and assessment in the future, supported by the new indicators and complemented by annual regional meetings for direct communication about their activities and by continued, but less frequent, assessment visits. The natural "kick-off" for this in Europe was the Belgian EU Presidency meeting in July 2010 and the Council Conclusions, though in parallel it is hoped that new WHO guidance will also be developed after the 2011 World Health Assembly and the report of the Fineberg Committee [35, 36].

References (entries as recoverable from the extracted text)
Introduction to pandemic influenza (Modular Texts).
Influenza A(H5N1) in Hong Kong Special Administrative Region of China. WER.
WHO Influenza Pandemic Plan. The Role of WHO and Guidelines for National and Regional Planning.
Global agenda on influenza, adopted version.
Resolution on prevention and control of influenza pandemics and annual epidemics. WHA.
World Health Organization. International Health Regulations.
World Health Organization. WHO Global Influenza Preparedness Plan.
Communication from the Commission to the Council, the European Parliament, the European Economic and Social Committee and the Committee of the Regions on pandemic influenza preparedness and response planning in the European Community.
Establishing a European Centre for Disease Prevention and Control. Official Journal of the European Union, 30.4.
Responding to the avian influenza pandemic threat. Recommended strategic actions. WHO, Geneva, September.
Influenza antiviral susceptibility monitoring activities in relation to national antiviral stockpiles in Europe during the winter 2006/2007 season.
ECDC Avian Influenza Portfolio (collected risk assessments and guidance for Member States).
Learning the right lessons in Europe from the 2009 pandemic.
Initial reflections on pandemic A(H1N1) 2009 and the international response.
ECDC Assessment tool for influenza preparedness in European countries, with a main focus on pandemic preparedness.
ECDC Pandemic Preparedness Self-Assessment Indicators, Summer.
How prepared is Europe for pandemic influenza? Analysis of national plans.
Progress and shortcomings in European national strategic plans for pandemic influenza. WHO Bulletin, e-print ahead of publication.
Common Ground: a pandemic influenza simulation exercise for the European Union.
Some suggested 'Acid Tests' for helping assess, strengthen local preparedness for moderate or severe pandemics.
WHO Whole of Society Pandemic Readiness.
Closing schools during an influenza pandemic: a review.
ECDC Pandemic preparedness in the EU/EEA. Status Report, Autumn.
EU Belgian Presidency Conference: Health security - lessons learned from the A(H1N1) pandemic in 2009 - better management of future health threats.
European Influenza Surveillance Network Annual Meeting.
Council conclusions on lessons learned from the A/H1N1 pandemic - Health security in the European Union.
How will the global response to the pandemic H1N1 be reviewed? (The Fineberg Committee).