Responding to COVID-19: an exploration of EU country responses and directions for further research

Abstract

Background

During COVID-19, scientists advising policymakers were forced to deal with high uncertainty and risks in an environment of unknowns. Evidence on which policies and measures were effective in responding to the pandemic remains too underdeveloped to answer the key question ‘what worked and why?’. This study aims to provide a basis for further studies to answer this critical question, by starting to look at efficacy, or how countries ensured that health services remained available and what measures were enacted to protect and treat their populations and workers.

Methods

We applied a three-phase sequential mixed methods design. In phase one, we conducted a qualitative content analysis of the EU Country Profile reports to retrieve and analyse data on COVID-19 responses taken by 29 countries in the European region. Phase two is the data transformation step, converting qualitative data into numerical codes that can be statistically analysed; these codes are then used in the quantitative cross-national comparative analysis that comprises phase three. The quantifying process resulted in a numerical indicator measuring the ‘response efficacy’ of the 29 countries, which phase three associates with country performance indicators derived from European Centre for Disease Prevention and Control (ECDC) COVID-19 case and death rate data.

Results

Through comparing the frequency of COVID-19 measures taken, we found that many countries in the European region undertook similar actions but with differing effects. The cross-national analysis revealed an expected relationship: a lower COVID-19 response efficacy appeared to be related to higher case and death rates. Still, marked variation was found among countries with similar response efficacy indicators, signalling that the combination and sequence of implementation of COVID-19 responses may be just as important as which response measures were implemented.

Conclusions

Many European countries employed similar COVID-19 measures but still had wide variation in their case and death rates. To unravel the question ‘what worked and why?’, we suggest directions from which more refined research can be designed that will eventually contribute to mitigating the impact of future pandemics and to being better prepared for their economic and human burden.

Background

The COVID-19 respiratory viral illness was first identified in China in December 2019 and declared a pandemic by the World Health Organisation (WHO) on the 11th of March 2020 [1]. Many countries implemented a range of public health measures aimed at reducing the illness’s spread and limiting its effects on their population, economy, and health system [2]. However, many of these measures could not be enacted carefully and consistently [3]. During this period scientists were not only dealing with unknowns regarding this novel virus, but were also forced to develop advice for policymakers and were challenged to quantify risks and uncertainties while enacting their existing pandemic plans. These plans’ performances may or may not have reflected their country’s pre-pandemic Global Health Security Index ratings [4]. At the same time scientific advisors faced intense pressure and criticism from politicians, the mass media, and the public, to provide certainty and to find a way of ‘returning to normal’ through quick fixes [5].

The management consulting industry suggests that when confronted by crisis or extreme uncertainty, a business’ traditional management operating models are seldom adequate, and organizations with poor or inadequate processes can find themselves facing a survival challenge [6]. Disaster and crisis management is also situated in a complex stakeholder environment, necessitating simultaneous coordination across country, provincial and local levels [7]. However, the COVID-19 crisis’ uncertainty was not related to a single event such as a hurricane or a nuclear reactor accident [7]; it was global, of long duration, and with compounding effects [6]. As such, even with recommended crisis preparedness actions such as assessments and plans, training, simulations and exercises [8], these preparations became less reliable during the pandemic, requiring new models of crisis response and organisation [6, 9].

Thus, national governments ended up following a variety of policy paths and policy mixes, leading to policy packages that included different combinations of tools [10] that, when combined, acted in unison to impede case rate expansion [11]. Adding to response complexity are the administrative traditions of each country, their political settings, whether the nation has a federalist or a centralist political system, and the style of the nation’s leaders [10]. National culture [12,13,14,15] and a nation’s social capital, social connectedness [16], levels of trust, and information diffusion and uptake [17,18,19] have also been found to influence the pandemic’s spread and the efficacy of the prevention and mitigation measures enacted [20]. Thus, the combinations of measures implemented by countries rely on both the governance of human activities and the efficient management of a nation’s limited resources, which can be maximized by the use of information technologies when responding to changing conditions [21].

As the pandemic progressed, much of the potentially helpful data became increasingly spread across a range of databases or scattered across many published articles and reports. A quick scan of Google Scholar using the search term ‘“COVID-19” AND “Country” AND “Responses”’ produced approximately 461,000 items published between the pandemic’s onset and mid-June 2023. This large and fragmented body of publications recounts country responses across a range of publication types such as peer-reviewed articles, country cases, policy or governance scoping studies, reports, and rapid reviews. So, while research and reporting created a data-rich environment on COVID-19 responses, there appear to be few meaningful translations into policies and practices that could help relieve the pressures and uncertainties experienced by scientists and policymakers. Table 1 provides examples of the range of study topics and scope from the literature scan conducted.

Table 1 Topics and scope of pandemic publications

An early move to curate the disparate EU country pandemic response information was made by the European Observatory on Health Systems and Policies, which created the COVID-19 Health System Response Monitor (HSRM) platform [53]. The HSRM is a publicly accessible online resource that contains organised information about European health systems’ COVID-19 responses, helping policymakers access data on what was occurring by and across countries and issues [47]. Concurrently, the European Commission’s 2021 editions of the State of Health in the EU Country Profile reports [54] specifically contained information on each country’s COVID-19 responses. Since the beginning of the pandemic, the European Centre for Disease Prevention and Control (ECDC) has been an authoritative data source collecting and providing access to COVID-19 country case and death rate statistics.

Many countries had pandemic response plans developed from similar views of emergency management. These plans contained a similar range of response measures that aimed (as in previous crisis situations) to address an expected increase in demand for health services while simultaneously managing an expected decrease in the available health workforce [50]. The differences in how these pandemic response policies and measures were implemented resulted in inconsistent outcomes [51]. As in previous emergency management experiences, the EU countries’ COVID-19 responses were implemented rapidly without substantive evaluation strategies, making it difficult to determine their effectiveness [50]. Following the pandemic’s wane, lessons are being identified on topics that include improving health system resilience [55], building better public health responses to future pandemics [56], and ensuring that the pandemic’s failures and successes are not forgotten and that lessons learned are reflected in new pandemic plans [10, 57].

As such, with the fragmented literature, unevaluated pandemic response implementation and separated data sources, it has been difficult to compare how countries actually ‘performed’ in their COVID-19 response – i.e., how they ensured that health services remained available and how they protected and treated their populations and workers. In addition, it remains difficult to identify the specific actions that countries took and to determine what mix or sequence of COVID-19 actions proved more successful than others. Studies such as Bollyky et al. [17] were unable to explain most of the cross-country variation in cumulative infection rates or infection-fatality ratios, and Dussauge-Laguna cautions that exploring a combination of factors is methodologically challenging due to “the multiple issues involved in understanding how countries reacted to the crisis and how successful they were in their responses across time” ([10], p. 126). Based on these notions, this study aims to add to the literature on this topic and to further clarify pandemic responses by re-addressing the complex but key question: what policies worked, and why, during the COVID-19 pandemic among countries?

Methods

Taking an incremental approach, we combined different databases and sources, and applied a mix of analytical steps to create a cross-national comparison that takes us past purely descriptive presentations of EU pandemic responses. In this way, we utilise the power of mixed methods to provide research directions from revealed patterns, rather than presenting final answers, and to progress knowledge on the above key question.

To reveal potential patterns in disparate data, an exploratory sequential mixed methods design is used. This is an appropriate methodological choice when one data source is insufficient to describe macro-outcomes from complex processes such as country policy responses, and when the results from earlier research require further explanation [58]. The design is also appropriate when research requires multiple approaches concurrently (or in sequence) to reach conclusions, find meaning and solve complex problems [59]. The exploratory sequential design (see Fig. 1) begins with a stage that prioritizes qualitative data collection and analysis, followed by a quantitative feature, which may be the generation of new variables, and then a third stage in which the new feature is quantitatively tested [58]. Result interpretation is based on how the quantitative results may add to the initial qualitative results or provide a clearer understanding of these data [58].

Fig. 1 The study’s mixed methods design. Source: Adapted from Creswell and Plano Clark, 2018 [58]

Our design follows the process outlined in Fig. 1. First, we applied content analysis to qualitative (QUAL) data from the EU Country Profile reports in phase one. Next, in phase two, we used data transformation (QUAL to QUAN) by quantifying the qualitative country data to develop a ranking indicator. Finally, in phase three, we merged this with the quantitative (QUAN) ECDC COVID-19 country case and death rate data to enable country comparisons and analysis.

Critical in this design is phase two, the data transformation step, i.e. the process of ‘quantitizing’: the conversion of qualitative data into numerical codes that can be statistically analysed [59]. According to Sandelowski, Voils and Knafl ([60], p. 208), this “has become a staple of mixed methods research”. The practical benefits of quantitizing are in aiding pattern recognition and resolving ‘mix-ups’ of meaning. The practice has been criticised, however, on several grounds: the assigned numbers have no objective anchoring; counting removes the phenomena under study from their contexts, making them (questionably) more definable, in opposition to the qualitative research tradition of treating data as uncountable and indivisible; and, once data conversion is accomplished, the work of conversion disappears along with the researchers’ judgments about these data and their representation [60]. Regardless, the process of data conversion is said to help reveal diversity within qualitatively derived data and to support its presentation [61].

Data collection

The country COVID-19 response data

The data source for the content analysis was the 29 publicly available 2021 EU Country Health Profile reports [37]. These reports, prepared by experts from the Organisation for Economic Co-operation and Development (OECD) and the European Observatory on Health Systems and Policies, are based on a conceptual framework that encompasses the EU Commission’s objective to sustain effective, accessible, and resilient health systems. Each report contains short statements on its drafting process, on access to advice from external experts and EU agencies and committees, and on its data types, source consistency and transparency, and how these are flexibly adapted to each country’s context [54]. All of this supports source reliability and confidence, and marks them as cross-nationally comparative and highly trustworthy documents. For the 2021 edition, each member Country Profile report’s Section 5.3 on resilience described the actual policy responses since the COVID-19 pandemic outbreak, uniform in structure and consistent in its reporting.

For our analyses, we focused on the paragraph headings of Section 5.3 as the key information object to identify and then code each country’s COVID-19 policy responses, measures, and trends. After full-text analysis of all 29 Country Profile Section 5.3 texts, we concluded that the paragraph headings concisely summarised the actions taken, including the effects of the COVID-19 policy responses as experienced by the countries, and therefore we used only these paragraph headings for our data extraction. We then coded the paragraph headings from every Section 5.3 of the 29 Country Profile reports for further analysis (see Section “Analysis of the country response COVID-19 data” below). These procedures align with Mackieson, Shlonsky and Connolly’s position that a purposeful methodology using highly trustworthy documentary data sources reduces qualitative research reporting and analysis biases [62].

The country COVID-19 statistical data

The second source used was the country case and death rate data retrieved from the public ECDC portal, for the same 29 countries and over the same period that the 2021 EU Country Profile reports covered – i.e., from spring 2020 (the start of the pandemic) until winter 2021. Section “Analysis of the country Covid-19 statistical data” below further describes how we handled and analysed the ECDC data.

Data analysis

Analysis of the country response COVID-19 data

The country response information retrieved from the EU Country Profile reports (i.e., the Section 5.3 paragraph headings) was processed using thematic analysis. We applied a method where the heading texts were coded into recurring thematic groups, first inductively (from the content) and then deductively (applying a specific frame, see below) [63]. One of the authors began the process inductively, coding the paragraph headings into distinctive themes documenting the COVID-19 responses. Next, the other two authors reviewed the coding and designed a coding frame for the deductive phase [64]. Upon completion, the coding frame was discussed among the authors for consistency and the codes were finalised at a group meeting.

The Country Profile reports’ Section 5.3 paragraph headings were assessed and coded to indicate a country’s COVID-19 response ‘efficacy’. The frame for deductive coding consisted of a first section of three coding categories that were used to describe a measure’s efficacy:

  (i) the paragraph headings describing a country response as ‘successful/positive’ (i.e. the text described an advance or success, using a positive tone),

  (ii) the paragraph headings describing a country response as ‘neutral’ (i.e. the text described neither an improvement, enhancement or development, nor a failure, reduction of effectiveness, decrease or decline), or

  (iii) the paragraph headings describing a country response as ‘failure/negative’ (i.e. the text described a failure, a setback or a result less than intended, using a negative tone).

Then, a section was added to the code frame to classify the type of COVID-19 response taken by the countries. This deductive thematic coding was undertaken by the corresponding author, a native English speaker, who used the above efficacy categories as a guide to classify and code the paragraph headings as positive, neutral or negative. Finally, the efficacy coding was subjected to an inter-coder reliability test to assess the level of agreement between coders reaching the same conclusions [65]. For this, the remaining two authors each test coded three different countries, each an approximate ten per cent sample of the coded texts. These test codings were then assessed using the online coding agreement calculator ReCal2, which provides coder percentage agreement and four other commonly used reliability statistics whose formulae account for agreement by chance [66]. The ReCal2 results showed an average coding percentage agreement of 93.32%, with the other reliability statistics indicating substantial (0.60 to 0.80) or nearly perfect (0.81 to 1.00) inter-coder agreement [67]. Table 2 presents the study’s inter-coder reliability results obtained from the ReCal2 website.

Table 2 Inter-coder reliability results

These reliability results are consistent with Julien’s view that reliability coefficients of not less than 60% are acceptable for qualitative content analysis due to the method’s interpretative nature [68]. Following the reliability testing, the authors discussed the test results, identified and resolved inconsistencies, and updated the coding frame, which was then applied to the rest of the coded paragraph heading texts.
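To illustrate the two kinds of quantities ReCal2 reports, the sketch below (in Python, with hypothetical coder data rather than the study’s actual codes) computes raw percentage agreement and Cohen’s kappa, one commonly used statistic that corrects agreement for chance:

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of units on which two coders assigned the same code."""
    assert len(coder_a) == len(coder_b)
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(coder_a)
    p_o = percent_agreement(coder_a, coder_b)
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Expected agreement if both coders assigned codes independently
    # at their observed marginal rates.
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical efficacy codes for five paragraph headings.
a = ["positive", "neutral", "negative", "positive", "neutral"]
b = ["positive", "neutral", "positive", "positive", "neutral"]
print(percent_agreement(a, b))  # 0.8
print(round(cohens_kappa(a, b), 3))
```

ReCal2 computes four chance-corrected statistics; kappa stands in here only as an example of how such corrections differ from raw percentage agreement.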

The validated coding results were finally used to develop the efficacy indicator for country comparison, standardizing and enabling the ‘quantitizing’ process. The efficacy indicator was calculated using the formula derived specifically for this study:

efficacy indicator = (number of positive codes - number of negative codes) / (total number of codes)

This produces a ratio between -1 and 1 for each country. A ratio of 1 indicates that 100% of the paragraph headings in a country profile were coded positively (i.e., all described the COVID-19 responses as an advance or success, or used a positive tone), while a ratio of -1 indicates that 100% of the paragraph headings were coded negatively (i.e., all described the COVID-19 responses as a failure, a setback or a result less than intended, or used a negative tone). In this way, the indicator enables a comparison of the ‘relative efficacy’ of the COVID-19 responses applied by the countries.
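The calculation above can be expressed directly in code. The following minimal sketch (in Python; the code lists of efficacy labels are illustrative) reproduces the study’s formula:

```python
def efficacy_indicator(codes):
    """Relative efficacy: (positives - negatives) / total codes.

    `codes` is the list of efficacy codes assigned to one country's
    Section 5.3 paragraph headings ('positive', 'neutral', 'negative').
    """
    pos = codes.count("positive")
    neg = codes.count("negative")
    return (pos - neg) / len(codes)

# All headings coded positively gives the maximum ratio of 1.0;
# an even balance of positive and negative codes gives 0.0.
print(efficacy_indicator(["positive"] * 4))                      # 1.0
print(efficacy_indicator(["positive", "negative",
                          "neutral", "neutral"]))                # 0.0
```

Note that neutral codes lower the ratio only through the denominator: a country with many neutral headings is pulled toward 0 regardless of its balance of positive and negative codes.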

Analysis of the country Covid-19 statistical data

From the extended and detailed data available in the ECDC portal, we selected two main indicators: (1) the 14-day notification rate of reported COVID-19 cases per 100,000 population, and (2) the 14-day notification rate of reported deaths per 1,000,000 population. We consider these two as the main ‘performance’ or outcome indicators related to the COVID-19 pandemic, as they are published as key indicators in the ECDC portal for which data were collected (and hence are available) for all countries and all periods/dates. Numbers from the ECDC portal were downloaded and processed in Excel, and averages were calculated over the period spring 2020 to winter 2021 for each country – the same period the Country Profile reports covered. The selected ECDC data were then merged by country and period with the relative efficacy country scores derived from the Country Profile report data (see above). This integrated country dataset was also analysed in Excel to produce the scatterplots and correlation calculations described in the next section.
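The merge-and-correlate step was performed in Excel; as a transparent re-expression of the same arithmetic, the sketch below (in Python, with entirely illustrative country values, not the study’s data) merges per-country efficacy scores with averaged case rates and computes the Pearson correlation that underlies a scatterplot trendline:

```python
import statistics

# Hypothetical merged records: per-country efficacy score plus an
# averaged ECDC 14-day case rate (per 100,000) over spring 2020 to
# winter 2021. Values are illustrative only.
countries = {
    "A": {"efficacy": 0.89, "case_rate": 210.0},
    "B": {"efficacy": 0.50, "case_rate": 340.0},
    "C": {"efficacy": 0.00, "case_rate": 455.0},
    "D": {"efficacy": -0.80, "case_rate": 610.0},
}

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

eff = [c["efficacy"] for c in countries.values()]
cases = [c["case_rate"] for c in countries.values()]
r = pearson_r(eff, cases)
print(round(r, 3))  # negative r: lower efficacy, higher case rate
```

A negative r here mirrors the expected relationship described in the Results; the scatter of countries around the fitted line, not r alone, carries the study’s observation about variation among similarly scored countries.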

Results

Our thematic coding resulted in 15 different codes, presented in Table 3, representing the different types of COVID-19 responses as summarized in the 29 Country Profile Section 5.3 paragraph headings. The (generic) code “measures in general” is the most frequent, representing heading texts which expressed that some (but not specified) form of response measure was undertaken by a country. This ‘catch-all’ code indicates a general measure of a country’s actions, while the other codes express more specific measures. Of these, “testing population” and “vaccination” are the measures most often included in the paragraph headings, with other frequent measures concerned with managing case rates and access to treatment facilities. Moderately frequent codes include “data/information” and “preparedness”; other codes addressed long-term care, which was considered a risk area for COVID-19 infections, or “primary care”, which was mobilised to reduce pressure on hospitals and to effect home-based treatments and vaccination programmes. The least frequent measures included in the paragraph headings are “PPE access”, “e-health/tele health” (regarding the use of information and communication technologies to reduce patient contact interactions) and “testing workforce”. Table 3 also shows the response measure code count by frequency, our coding (positive/neutral/negative) and the calculated efficacy indicator (see Section “Analysis of the country response COVID-19 data”).

Table 3 Code count of 15 different COVID-19 response measures, extracted from the 5.3 paragraph headings in 29 Country Profiles (2021), frequency, allocated efficacy score and relative efficacy indicator score, by measure

The efficacy indicator scores in the last column of Table 3 reveal that some COVID-19 response measure categories were coded more positively than others across the country sample. The “tele health/e-health” code had a relatively high and positive efficacy result, as did “primary care”. “Pandemic response and progress” was rated 0.00, and the “shortages” code received, as could be expected, a negative efficacy indicator score. “Measures in general”, the most frequent (but also neutral) code, was rated 0.50. This indicates a divide in the effects of the general measures or actions taken, and suggests variation in the outcomes and/or implementation of the range of measures taken by the sample countries.

Next, Table 4 provides the country scores sorted by code count, showing the frequency, average coding result and efficacy indicator scores for the 29 European countries included. The frequency of code counts varied across the countries, with the highest count recorded by Estonia and the lowest by Belgium – with a mean count of 9.72 and a median of 9. Comparing this rank ordering with the efficacy indicator scores reveals that the countries with the highest number of measures taken (i.e., coded) do not necessarily have the highest efficacy scores. Examples are the Netherlands, Slovakia, Sweden and Poland, countries for which over 10 different types of COVID-19 response measures were coded but which have relative efficacy indicator scores close to 0.

Table 4 Code count, frequency, and efficacy indicator score of 15 different COVID-19 response measures, extracted from the 5.3 paragraph headings in 29 Country Profiles (2021) by country

While these descriptive results are interesting, they do little to reveal patterns associated with the various policy responses applied during the COVID-19 pandemic. To resolve this and to add depth to our analysis, we created scatterplots to analyse the correlation between the countries’ efficacy scores and their COVID-19 case and death rates.

Figure 2 presents the first scatterplot, associating each country’s efficacy score with its COVID-19 case rate per 100,000 people, while in Fig. 3 the efficacy score is associated with each country’s COVID-19 death rate per 100,000 people. Both figures include a line of best fit, drawn to illustrate the expected negative relationship, which appears to be confirmed: the lower a country’s relative efficacy score for the COVID-19 policy responses it took, the higher its case and death rates. Still, the two scatterplots also reveal that countries with similar efficacy indicator scores can have considerable variation in their case or death rates. This pattern of variation is first discernible in the group of high efficacy countries (i.e. the upper part of the scatterplot), where the case or death rates are spread widely across the x axis. The pattern also applies to the countries above and below the line of best fit, i.e. the other approximate high middle, low middle, and low efficacy score country groupings.

Fig. 2 Relative efficacy indicator score (based on our coding of all COVID-19 policy responses taken in a country during 2020–2021) by COVID-19 case rates per 100,000 (based on ECDC data, 2020–2021) in that same country, for 29 European countries

Fig. 3 Relative efficacy indicator score (based on our coding of all COVID-19 policy responses taken in a country during 2020–2021) by COVID-19 death rates per 100,000 (based on ECDC data, 2020–2021) in that same country, for 29 European countries

Discussion

To the best of our knowledge, our study is the first to measure the efficacy of COVID-19 responses of European countries from the beginning of the pandemic until the end of 2021, and to associate this with each country’s COVID-19 case and death rates. Our efficacy indicator is based on coded paragraph headings in the EU Country Profile reports, by which different levels of ‘success and failure’ of COVID-19 measures taken by countries are quantified. There were large variations in country responses, and the number of COVID-19 responses (according to the first analysis in Table 3) does not seem to correlate directly with their efficacy. In addition, our cross-national analyses show marked variation in countries’ COVID-19 case and death rates even when they share similar efficacy indicator scores. We conclude that the (expected) negative association between the efficacy of countries’ COVID-19 responses and their COVID-19 case and death rates as outcome indicators can be observed, but it is not clear-cut.

For example, the four countries with the lowest efficacy scores are Slovakia (-0.80), Hungary (-0.13), Latvia (-0.13) and Sweden (-0.09); two countries with middling scores are Cyprus (0.50) and Belgium (0.50); while those with high efficacy scores are Denmark (0.89), Germany (0.89) and Greece (0.91). However, the countries with the highest code counts are Estonia (19), Norway (14), Croatia (12) and the Netherlands (12), while the five lowest code count countries are Portugal, Austria, Ireland and Iceland, all with seven codes, and Belgium with six.

Our results therefore suggest several directions for future studies, to further knowledge on the key question ‘what (during the COVID-19 pandemic) worked and why?’.

Firstly, the results suggest that it is relevant to explore whether a combination of policies, when achieving the desired effect, enabled some countries to respond better than others. The results seem to indicate that it is not a case of ‘the more, the better’; rather, there is a function of activity, i.e. when actions began and how lessons were applied as time progressed. This idea of combination aligns with Dussauge-Laguna’s [10] and Kaimann and Tanneberg’s [11] observations, and as such we suggest that the countries that achieved greater success in their COVID-19 measures should be examined in more detail to reveal if and how their policy combinations worked (or didn’t) for them. This will contribute to a better understanding of the critical factors in controlling the COVID-19 cases and the resultant death waves that affected all countries from the beginning of 2021. In addition, a comparison with these apparent success factors can be undertaken in the countries that were less successful (i.e. had lower relative efficacy indicator scores), providing contrasting data on how success is achieved. Such lessons, if uncovered, are useful for countries to adopt in preparation for future pandemics or for their health system strengthening policies in general, particularly if they include greater investments in risk communication and community engagement strategies that aim to increase individuals’ confidence in public health guidance [17].

Secondly, we aggregated different types of policies or actions in our efficacy indicator score, but recognize that some had more negative codes than others, and we did not evaluate how the actual implementation may have influenced this. For example, the measures we coded as ‘tele health/e-health’ and ‘primary care’ appear to have high efficacy ratings, but fewer countries seem to have implemented them. The finding for ‘tele health/e-health’ appears counter-intuitive, as for many countries the pandemic promoted wider use of telehealth consultations [69]. Reviewing the code data, we find that it is the countries establishing these measures or extending digital health infrastructures that are included in this code, rather than indications of increased telehealth use or consultations. While not fully explaining this anomaly, it does reveal one of mixed methods’ limitations – that counting qualitative data risks removing data from its context and may obscure meanings [59] – and is an example of the methodological risks pointed out by Dussauge-Laguna for this type of study [10]. Higher count measures regarding testing, tracing, vaccination, and ICU beds also reveal that these measures variously ‘underperformed’, i.e., might not have been implemented effectively. As much has been learned about the responses to the COVID-19 pandemic, measures on ICU capacity can be expected to be less of an issue in the future, while ‘tracing’ measures raise concerns both in terms of development costs and contributions to lowering case rates. This variability of COVID-19 measure effects suggests that research into the mobilisation of response measures and their implementation methods can lead to better knowledge – in particular, which of the measures produced positive outcomes, in what contexts, and how and why.
Additionally, a more targeted approach could address those populations affected by cultural priorities or adherence dispositions [12, 14, 15], as could an analysis of the nature of government systems and orientations of government control [10]. Moreover, further case studies are needed to research how successful technology programs were developed, operated, and accepted – not just in terms of the technology itself, but also in human terms, i.e. providing better insight into how, for example, tracing using technology in pandemic situations can be streamlined and enabled more effectively, to support the other public health measures being applied.

Thirdly, this study suggests the importance of the timing or sequencing of measure implementation. For example, the Netherlands had relatively many response measure codes, but similar numbers of positive and negative experiences. This suggests that while the Netherlands put significant effort into implementing its COVID-19 responses, some of the measures failed to achieve their intended results. Five of its six positively coded measures – “preparedness” (2), “IC beds” (2) and “primary care” – are associated with early pandemic responses, with the sixth being “vaccination”; its five negatively coded measures – “data/information”, “face mask/PPE”, “tracing population” (2) and “testing population” – appear more related to the continuation or extension of measures as the pandemic progressed. This interpretation is supported by a report on the Netherlands’ COVID-19 response, which determined that response measures involving the behaviour of the population became less effective as the pandemic continued [70]. This insight poses a new question to be addressed: is there an ‘optimum window of opportunity’ for COVID-19 responses that can be explained from a longitudinal perspective? By reviewing the sequences of measures (i.e. when they were implemented), the relationships between them and a sense of the ‘critical paths’ for pandemic and health crisis planning can be better explored. Given the significant financial and human resources mobilised to respond to COVID-19, a better understanding of the measures and informed decisions taken (and their response planning and implementation) should eventually contribute to how response measure sequencing can be assured.

As indicated in the introduction, there is a growing body of literature on the pandemic’s effect on the health workforce, including the knock-on effects that impact whole health systems (for example, see [71]). Thus, further work on identifying which measures specifically reduced deleterious workforce effects, and which responses actually assisted health workforce recovery, is of key importance for future workforce policies, preparing a more resilient and ready workforce for future health crises.

Lastly, we applied a method that helps to draw understanding from disparate country response data and to acknowledge when this information is insufficient to provide categorical evidence. The method finds otherwise hidden patterns and reveals contexts through qualitative research’s meaningful representation of concepts, showing an explanatory potential that may not be apparent when using more positivist approaches [72]. It rests on a cross-national analysis that is intended to drive further research, in line with the exploratory and incremental approach we took.

Study limitations

While we took care to reduce the biases associated with mixing different types of research methods, data source selection, coding reliability and data quantification, our study still has some limitations. The first is the choice of mixed methods to answer the guiding question, given the controversy over converting qualitative data into quantitative forms as a means to better understand the phenomena being studied. Here lies the possibility of overlooking or decontextualising data and their meanings, which may affect the results and their interpretation, leading to weaker conclusions or less qualified insights. The second is the quality and representativeness of the source data. While the 2021 EU Country Health Profile reports are assumed to be produced with minimal bias, it is possible that the source data and its interpretative summarisation contain existing biases that are reproduced throughout the study, affecting the country comparisons, pattern analysis, observations and insights. The last is the simple formula used to calculate the efficacy indicator: while a pragmatic and adequate approach for country positioning, it also contributes to intra- and inter-country data decontextualisation.
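To make the quantification step concrete, the simple efficacy formula mentioned above can be illustrated with a minimal sketch. This is purely illustrative: the exact formula is not reproduced in this section, so the sketch assumes the indicator balances positively against negatively coded measures; the authors’ actual calculation may differ.

```python
# Illustrative sketch only: assumes a hypothetical efficacy indicator that
# balances positively and negatively coded response measures. The paper's
# actual formula is not given in this section and may differ.

def response_efficacy(positive_codes: int, negative_codes: int) -> float:
    """Hypothetical indicator: net positive share of all coded measures."""
    total = positive_codes + negative_codes
    if total == 0:
        return 0.0
    return (positive_codes - negative_codes) / total

# Example using the counts reported above for The Netherlands
# (6 positive, 5 negative codes): a near-zero value, consistent
# with its low efficacy rating despite a high measure count.
nl = response_efficacy(6, 5)
print(round(nl, 2))  # prints 0.09
```

A simple ratio like this is easy to compare across countries, but, as noted, it strips away the context in which each measure was coded.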

Conclusions

Our study confirms the much-discussed notion that EU member states had variable success in responding to the COVID-19 pandemic. While the fragmented and diverse literature landscape made it difficult to discern explanations for this variation, we see our results as useful when preparing for future pandemics.

We found that many European countries employed similar measures, yet note wide variation in their case and death rates. Indications of how countries performed during the COVID-19 pandemic were developed through a mixed methods approach to data gathering and analysis. The results reveal high variation in the efficacy of COVID-19 measures, among both ‘well’ and ‘lower’ performing countries in terms of their COVID-19 case and death rates. For instance, The Netherlands had a high measure count but as many negative as positive experiences, giving it a low efficacy rating, yet it recorded lower death rates, though higher infection rates, than other EU countries. Nevertheless, the results sustain the expected negative association between countries’ COVID-19 response efficacy and their COVID-19 case and death rates.

Therefore, we suggest a range of directions from which more refined research questions can be formulated to better understand the critical question of ‘what worked and why’ during the COVID-19 pandemic. From this, we also propose that future health crisis plans can be improved and new workforce development and response policies devised that will eventually contribute to mitigating the future fiscal and human burdens of new pandemics.

Availability of data and materials

Data used in this study is available from the corresponding author (GR) upon request.

Abbreviations

COVID-19:

Coronavirus disease 2019

ECDC:

European Centre for Disease Control

EU:

European Union

HSRM:

COVID-19 Health System Response Monitor

OECD:

Organisation for Economic Co-operation and Development

QUAL:

Qualitative

QUAN:

Quantitative

WHO:

World Health Organization

References

  1. World Health Organization. WHO Director-General’s Opening Remarks at the Media Briefing on COVID-19 - 11 March 2020. World Health Organization. 2020. https://www.who.int/director-general/speeches/detail/who-director-general-s-opening-remarks-at-the-media-briefing-on-covid-19---11-march-2020.

  2. Gordon DV, Grafton RQ, Steinshamn SI. Cross-country effects and policy responses to COVID-19 in 2020: the Nordic countries. Econ Anal Policy. 2021;71:198–210.

  3. Correia T. SARS-CoV-2 pandemics: the lack of critical reflection addressing short- and long-term challenges. Int J Health Plann Manage. 2020;35:669–72.

  4. Lewis M. The premonition : a pandemic story. New York: W.W. Norton & Company; 2021.

  5. Correia T. The precariousness of political management of the SARS-CoV-2 pandemic in the search for scientific answers: calling for prudence in public health emergencies. Int J Health Plann Manage. 2021;36:1387–91.

  6. Finn P, Mysore M, Usher O. When nothing is normal: managing in extreme uncertainty | McKinsey. www.mckinsey.com. 2020. https://www.mckinsey.com/capabilities/risk-and-resilience/our-insights/when-nothing-is-normal-managing-in-extreme-uncertainty.

  7. D’Emidio T, Fox Z, Spaner J, Usher O. Crisis management: build resilient organizations | McKinsey. www.mckinsey.com. 2022. https://www.mckinsey.com/capabilities/risk-and-resilience/our-insights/building-resilience-the-history-and-future-of-us-crisis-management.

  8. PricewaterhouseCoopers. Crisis planning and preparation. PwC. 2021. https://www.pwc.com/gx/en/issues/crisis-solutions/crisis-planning-preparation.html.

  9. Butler M, Rivera K. How to respond when a crisis becomes the new normal. strategy+business. 2020. https://www.strategy-business.com/blog/How-to-respond-when-a-crisis-becomes-the-new-normal.

  10. Dussauge-Laguna MI. National government responses to the COVID-19 pandemic: an exploration of policies, factors, and lessons (to be) learned. Rev Asian Pac Stud. 2023;48:121–42.

  11. Kaimann D, Tanneberg I. What containment strategy leads us through the pandemic crisis? An empirical analysis of the measures against the COVID-19 pandemic. PLoS ONE. 2021;16:e0253237.

  12. Ashraf BN, El Ghoul S, Goodell JW, Guedhami O. What does COVID-19 teach us about the role of national culture? Evidence from social distancing restrictions. J Int Finan Markets Inst Money. 2022;80:101647.

  13. Chen Y, Biswas MI. Impact of national culture on the severity of the COVID-19 pandemic. Curr Psychol. 2022;42:15813–26.

  14. Wang Y. Government policies, national culture and social distancing during the first wave of the COVID-19 pandemic: international evidence. Saf Sci. 2021;135:105138.

  15. Yu N, Tao L, Zou G. Quantitative relationships between national cultures and the increase in cases of novel coronavirus pneumonia. Sci Rep. 2023;13:1646.

  16. Larnyo E, Tettegah S, Griffin B, Nutakor JA, Preece N, Addai-Dansoh S, et al. Effect of social capital, social support and social network formation on the quality of life of American adults during COVID-19. Sci Rep. 2024;14:2647.

  17. Bollyky TJ, Hulland EN, Barber RM, Collins JK, Kiernan S, Moses M, et al. Pandemic preparedness and COVID-19: an exploratory analysis of infection and fatality rates, and contextual factors associated with preparedness in 177 countries, from Jan 1, 2020, to Sept 30, 2021. Lancet. 2022;399:1489–512.

  18. Topazian RJ, McGinty EE, Han H, Levine AS, Anderson KE, Presskreischer R, et al. US adults’ beliefs about harassing or threatening public health officials during the COVID-19 pandemic. JAMA Netw Open. 2022;5:e2223491.

  19. Lee SK, Sun J, Jang S, Connelly S. Misinformation of COVID-19 vaccines and vaccine hesitancy. Sci Rep. 2022;12:13681.

  20. Stojkoski V, Utkovski Z, Jolakoski P, Tevdovski D, Kocarev L. Correlates of the country differences in the infection and mortality rates during the first wave of the COVID-19 pandemic: evidence from Bayesian model averaging. Sci Rep. 2022;12:7099.

  21. Lan L, Li G, Mehmood MS, Xu T, Wang W, Nie Q. Investigating the spatiotemporal characteristics and medical response during the initial COVID-19 epidemic in six Chinese cities. Sci Rep. 2024;14:7065.

  22. Buchan J, Zapata T. Governing health workforce responses during COVID-19. Eurohealth. 2021;27:41–8.

  23. Burau V, Falkenbach M, Neri S, Peckham S, Wallenburg I, Kuhlmann E. Health system resilience and health workforce capacities: comparing health system responses during the COVID-19 pandemic in six European countries. Int J Health Plann Manage. 2022;37:2032–48.

  24. Winkelmann J, Webb E, Williams GA, Hernández-Quevedo C, Maier CB, Panteli D. European countries’ responses in ensuring sufficient physical infrastructure and workforce capacity during the first COVID-19 wave. Health Policy. 2021;126:362–72.

  25. Webb E, Winkelmann J, Scarpetti G, Behmane D, Habicht T, Kahur K, et al. Lessons learned from the Baltic countries’ response to the first wave of COVID-19. Health Policy. 2021;126:438–45.

  26. Džakula A, Banadinović M, Lovrenčić IL, Vajagić M, Dimova A, Rohova M, et al. A comparison of health system responses to COVID-19 in Bulgaria, Croatia and Romania in 2020. Health Policy. 2022;126:456–64.

  27. Schmidt AE, Merkur S, Haindl A, Gerkens S, Gandré C, Or Z, et al. Tackling the COVID-19 pandemic: initial responses in 2020 in selected social health insurance countries in Europe. Health Policy. 2022;126:476–84.

  28. Berardi C, Antonini M, Genie MG, Cotugno G, Lanteri A, Melia A, et al. The COVID-19 pandemic in Italy: policy and technology impact on health and non-health outcomes. Health Policy Technol. 2020;9:454–87.

  29. Kennelly B, O’Callaghan M, Coughlan D, Cullinan J, Doherty E, Glynn L, et al. The COVID-19 pandemic in Ireland: an overview of the health service and economic policy response. Health Policy Technol. 2020;9:419–29.

  30. Unruh L, Allin S, Marchildon G, Burke S, Barry S, Siersbaek R, et al. A comparison of health policy responses to the COVID-19 pandemic in Canada, Ireland, the United Kingdom and the United States of America. Health Policy. 2021;126:427–37.

  31. Desson Z, Weller E, McMeekin P, Ammi M. An analysis of the policy responses to the COVID-19 pandemic in France, Belgium, and Canada. Health Policy Technol. 2020;9:430–46.

  32. Huffman A. COVID-19 surges then crickets and the impact on the emergency department workforce. Ann Emerg Med. 2022;79:A11–4.

  33. Lotta G, Nunes J, Fernandez M, Garcia CM. The impact of the COVID-19 pandemic in the frontline health workforce: perceptions of vulnerability of Brazil’s community health workers. Health Policy OPEN. 2022;3:100065.

  34. Aron JA, Bulteel AJB, Clayman KA, Cornett JA, Filtz K, Heneghan L, et al. Strategies for responding to the COVID-19 pandemic in a rural health system in New York state. Healthcare. 2021;9:100508.

  35. Intinarelli G, Wagner LM, Burgel B, Andersen R, Gilliss CL. Nurse practitioner students as an essential workforce: the lessons of coronavirus disease 2019. Nurs Outlook. 2020;69:333–9.

  36. Li D, Howe AC, Astier-Peña M-P. Primary health care response in the management of pandemics: learnings from the COVID-19 pandemic. Atención Primaria. 2021;53(SUPP 1):S93–9.

  37. Peck JL, Sonney J. Exhausted and burned out: COVID-19 emerging impacts threaten the health of the pediatric advanced practice nursing workforce. J Pediatr Health Care. 2021;35:414–24.

  38. Sobers NP, Howitt CH, Jeyaseelan SM, Greaves NS, Harewood H, Murphy MM, et al. Impact of COVID-19 contact tracing on human resources for health – A Caribbean perspective. Prev Med Rep. 2021;22:101367.

  39. Lee D, Choi B. Policies and innovations to battle Covid-19 – A Case study of South Korea. Health Policy Technol. 2020;9:587–97.

  40. Taylor MK, Kinder K, George J, Bazemore A, Mannie C, Phillips R, et al. Multinational primary health care experiences from the initial wave of the COVID-19 pandemic: A qualitative analysis. SSM – Qual Res Health. 2022;2:100041.

  41. van Stralen AC, Carvalho CL, Girardi SN, Massote AW, Cherchiglia ML. Estratégias internacionais de flexibilização da regulação da prática de profissionais de saúde em resposta à pandemia da COVID-19: revisão de escopo. Cad Saude Publica. 2022;38:e00116321.

  42. Bosco A, Tay HW, Aleem I, Citak M, Uvaraj NR, Park JB, et al. Challenges to the orthopedic resident workforce during the first wave of COVID-19 pandemic: lessons learnt from a global cross-sectional survey. J Orthop. 2021;27:103–13.

  43. Wells KJ, Dwyer AJ, Calhoun E, Valverde PA. Community health workers and non-clinical patient navigators: a critical COVID-19 pandemic workforce. Prev Med. 2021;146:106464.

  44. World Health Organization. Impact of COVID-19 on human resources for health and policy response: the case of Plurinational State of Bolivia, Chile, Colombia, Ecuador and Peru: overview of findings from five Latin American countries. Geneva: World Health Organization; 2021.

  45. Nicola M, Sohrabi C, Mathew G, Kerwan A, Al-Jabir A, Griffin M, et al. Health Policy and Leadership Models During the COVID-19 Pandemic- Review Article. Int J Surg. 2020;81:122–9.

  46. Kumpunen S, Webb E, Permanand G, Zheleznyaknov E, Edwards N, van Ginneken E, et al. Transformations in the landscape of primary health care during Covid-19: themes from the European region. Health Policy. 2021;126:391–7.

  47. Merkur S, Maresso A, Cylus J, van Ginneken E, Lessof S. Lessons from the first wave: the COVID-19 Health System Response Monitor (HSPM) an evidence resource and a source of analysis. Eurohealth. 2020;26:5–9.

  48. Weston MJ. Strategic Planning for a Very Different Nursing Workforce. Nurse Lead. 2022;20:152–60.

  49. Busato F, Chiarini B, Cisco G, Ferrara M, Marzano E. Lockdown policies: a macrodynamic perspective for Covid-19. Economia. 2021;22:198–213.

  50. Coates A, Fuad A-O, Hodgson A, Bourgeault IL. Health workforce strategies in response to major health events: a rapid scoping review with lessons learned for the response to the COVID-19 pandemic. Hum Resour Health. 2021;19:154.

  51. De Raeve P, Adams E, Xyrichis A. The impact of the COVID-19 pandemic on nurses in Europe: a critical discussion of policy failures and opportunities for future preparedness. Int J Nurs Stud Adv. 2021;3:100032.

  52. O’Reilly-Jacob M, Perloff J, Sherafat-Kazemzadeh R, Flanagan J. Nurse practitioners’ perception of temporary full practice authority during a COVID-19 surge: a qualitative study. Int J Nurs Stud. 2021;126:104141.

  53. European Observatory on Health Systems and Policies. Overview. eurohealthobservatory.who.int. 2023. https://eurohealthobservatory.who.int/monitors/hsrm/overview.

  54. European Commission. Country Health Profiles. health.ec.europa.eu. 2023. https://health.ec.europa.eu/state-health-eu/country-health-profiles_en.

  55. Sagan A, Webb E, Azzopardi-Muscat N, de la Mata I, McKee M, Figueras J, editors. Health systems resilience during COVID-19: lessons for building back better. Copenhagen: World Health Organisation Europe; 2021.

  56. Ali S, Smith MJ, Stranges S. Where did public health go wrong? Seven lessons from the COVID-19 pandemic. Eur J Public Health. 2024;34:618–9.

  57. Sridhar D. I helped advise the US government on the next likely pandemic. What I learned is alarming. The Guardian. 2024. https://www.theguardian.com/commentisfree/2024/mar/25/us-government-pandemic-virus-vaccine.

  58. Creswell JW, Clark VLP. Designing and conducting mixed methods research. 3rd ed. Los Angeles: Sage; 2018.

  59. Teddlie C, Tashakkori A. Major issues and controversies in the use of mixed methods in the social and behavioral sciences. In: Tashakkori A, Teddlie C, editors. Handbook of mixed methods in social & behavioral research. Thousand Oaks: Sage Publications; 2003. p. 3–50.

  60. Sandelowski M, Voils CI, Knafl G. On quantitizing. J Mixed Methods Res. 2009;3:208–22.

  61. Maxwell JA. Using numbers in qualitative research. Qual Inq. 2010;16:475–82.

  62. Mackieson P, Shlonsky A, Connolly M. Increasing rigor and reducing bias in qualitative research: a document analysis of parliamentary debates using applied thematic analysis. Qual Soc Work. 2018;18:965–80.

  63. Green J, Thorogood N. Qualitative methods for health research. 4th ed. Los Angeles: Sage; 2018.

  64. Benaquisto L. Coding Frame. In: Given LM, editor. The Sage encyclopedia of qualitative research methods 1&2. California: SAGE Publications; 2008. p. 88–9.

  65. van den Hoonaard WC. Inter - and intracoder reliability. In: Given LM, editor. The Sage encyclopedia of qualitative research methods. California: SAGE Publications; 2008. p. 445–6.

  66. Freelon D. ReCal: Intercoder reliability calculation as a web service. Int J Internet Sci. 2010;5:20–33.

  67. O’Connor C, Joffe H. Intercoder reliability in qualitative research: debates and practical guidelines. Int J Qual Methods. 2020;19:1609406919899220.

  68. Julien H. Content analysis. In: Given LM, editor. The Sage encyclopedia of qualitative research methods. California: SAGE Publications; 2008. p. 120–2.

  69. Greenhalgh T, Wherton J, Shaw S, Morrison C. Video consultations for covid-19. BMJ. 2020;368:m998.

  70. Onderzoeksraad voor Veiligheid. Aanpak coronacrisis: Deel 2: september 2020 tot juli 2021. Den Haag: Onderzoeksraad voor Veiligheid; 2022. https://onderzoeksraad.nl/onderzoek/aanpak-coronacrisis-deel-2-september-2020-juli-2021/.

  71. Doshmangir L, Iezadi S, Gholipour K, Gordeev VS. The future of Iran’s health workforce. Lancet. 2022;400:883.

  72. Plakoyiannaki E, Wei T, Prashantham S. Rethinking qualitative scholarship in emerging markets: researching, theorizing, and reporting. Manag Organ Rev. 2019;15:217–34.

Acknowledgements

The authors would like to acknowledge the participants and scientific committee of the EHMA conference 2021, where a previous version of this work was presented and feedback received.

Funding

No funding was received to complete this study.

Author information

Authors and Affiliations

Authors

Contributions

All authors participated in the article’s conceptualisation, with RB leading quantitative and qualitative data extraction and quantitative analysis, GR leading the qualitative analysis, and CS and RB participating in intercoder reliability testing, with reliability data and analysis conducted by GR. GR led the manuscript drafting, with all authors participating in the editing and final approval of the manuscript.

Corresponding author

Correspondence to Gareth H. Rees.

Ethics declarations

Ethics approval and consent to participate

No ethical approval was required for this article.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

Reprints and permissions

About this article

Cite this article

Rees, G.H., Batenburg, R. & Scotter, C. Responding to COVID-19: an exploration of EU country responses and directions for further research. BMC Health Serv Res 24, 1198 (2024). https://doi.org/10.1186/s12913-024-11671-z


Keywords