
The EU experience in the first phase of COVID-19: implications for measuring preparedness

Countries
Croatia, Finland, Germany, Italy, Spain
Sources
ECDC

This report analyses the experiences of five EU countries (Croatia, Finland, Germany, Italy and Spain) from the beginning of the pandemic until COVID-19 vaccines became available at the end of 2020.

The report focuses on testing and surveillance, healthcare sector coordination, and emergency risk communication, identifying the specific challenges experienced in these areas as well as successful responses to them.

Implications for measuring preparedness are identified to inform future outbreak preparedness efforts in EU Member States.

Executive summary

In light of the challenges experienced during the COVID-19 crisis, European Union (EU) legislation has been reviewed to strengthen the EU’s collective preparedness to respond to communicable disease threats in the future. Decision 1082/2013/EU on serious cross-border health threats is being revised into a Regulation, which is due to be adopted in the autumn of 2022. The ECDC mandate has also been revised and will enter into force once the Regulation on serious cross-border health threats is adopted and published in the Official Journal of the EU. The measurement and assessment of the performance of public health emergency preparedness (PHEP) systems is a key component of the process of strengthening preparedness.

This technical report presents an analysis focusing on three issues (testing and surveillance, healthcare sector coordination, and emergency risk communication) during the first phase of the COVID-19 pandemic. The analysis identifies the specific challenges experienced in this phase, as well as successful responses to them. Implications for measuring preparedness are also identified in order to inform future outbreak preparedness efforts in EU Member States.

This analysis is based on the experiences of five countries (Croatia, Finland, Germany, Italy and Spain) during the first phase of the pandemic, i.e. before the initiation of vaccination programmes in December 2020. It draws on:

  1. pandemic preparedness and response plans, standard operating procedures and other documents related to COVID-19 response measures provided by the countries,
  2. interviews with country representatives, and
  3. other literature identified through rapid literature reviews.

The analysis identifies the following overarching issues with existing measurement systems for preparedness:

  • The COVID-19 pandemic required EU Member States to develop, under pressure, new strategies, approaches and policies for their PHEP systems and structures, and to review and revise these as the pandemic evolved. The extent of revision and innovation required was not anticipated by existing measurement tools for preparedness.
  • Existing measurement tools for preparedness are generally not aligned with a country’s internal hierarchical structure of public health, healthcare and other entities that influence emergency responses.
  • Existing measurement tools for preparedness generally do not reflect the coordination required among different parts of the healthcare system, particularly at the hospital and community-based levels.
  • Existing measurement tools for preparedness generally do not account for the flexibility and resilience required to address the challenges of scaling up a country’s pandemic response.

Section 3.1 builds on these overarching themes, identifying specific issues that are missing from, or not adequately covered in, existing preparedness measurement systems, particularly the ECDC Health Emergency Preparedness Self-assessment (HEPSA) tool and the WHO Joint External Evaluation (JEE) tool and, in parts, the Global Health Security Index (GHSI). The following conclusions are drawn:

  • The measurement tools for preparedness should include an indicator for the ability to conduct testing at scale, which was critical in the early phase of the pandemic.
  • An indicator for surveillance system flexibility should be introduced in the measurement tools for preparedness.
  • More broadly, the existing measurement tools for preparedness related to testing and surveillance cover the main tasks but do not address the ability of systems to scale up testing capacity, the importance and complexity of sub-national structures for surveillance and epidemiological investigation, or the challenges of adapting existing surveillance systems and developing new ones during the pandemic. These gaps need to be addressed with appropriate indicators.
  • Although three capabilities in the ECDC PHEP logic model (management of medical countermeasures, supplies and equipment; medical surge; and hospital infection-control practices) proved to be critical, they are not represented by corresponding indicators in existing measurement tools for preparedness.
  • The ‘preventive services’ capability in the ECDC PHEP logic model should be expanded into a new, broader capability on the ‘coordination of population-based medicine’, defined as the ability, during an outbreak of a high-impact infectious disease, to activate and strengthen coordination within a given geographical territory among public health, outpatient care (including primary care services), mental health and social support agencies, the public and private sectors, and inpatient health care, using integrated pathways between the different levels of care (outpatient and inpatient).
  • The experience of the COVID-19 response showed that the risk communication capabilities identified in the ECDC PHEP logic model are valid and relevant, but they are not fully represented in existing measurement tools for preparedness. In addition, countries experienced difficulties in managing an epidemic of information, which suggests that the logic model should be further expanded to include a fifth capability, ‘infodemic management’, i.e. dealing with an overabundance of information, some accurate and some not.

In summary, considering the different existing preparedness measurement systems, the analysis in this report suggests that the measurement approach and format used in the Joint External Evaluation process, for example, could be useful for assessing the EU’s preparedness efforts. This would involve first developing a set of measurement tools and indicators to address the areas identified in this analysis as insufficiently developed, and then creating a scoring system or scale for each domain. As in the JEE process, the assessment would begin with an analysis and preliminary scoring by national experts. This would be followed by a meeting in which peers from other countries review the documentation from the internal analysis and meet with national experts to reach consensus on the scoring. The evaluation process could cover an analysis of existing systems, performance during the COVID-19 pandemic, and the ‘stress tests’ mentioned in proposed EU legislation on health emergency preparedness and response.