Inefficiency of public hospitals: a multistage data envelopment analysis in an Italian region

Abstract

Background

The objective of this study was to assess public hospital efficiency, including quality outputs, inefficiency determinants, and changes to efficiency over time, in an Italian region. To achieve this aim, the study used secondary data from the Veneto region for the years 2018 and 2019.

Methods

A nonparametric approach—that is, multistage data envelopment analysis (DEA)—was applied to a sample of 43 hospitals. We identified three categories of input: capital investments (Beds), labor (FTE), and operating expenses. We selected five efficiency outputs (outpatient visits, inpatients, outpatient visit revenue, inpatient revenue, and bed occupancy rate) and two quality outputs (mortality rate and inappropriate admission rate). Efficiency scores were estimated and decomposed into two components. Slack analysis was then conducted. Further, DEA efficiency scores were regressed on internal and external variables using a Tobit model. Finally, the Malmquist Productivity Index was applied.

Results

On average, the hospitals in the Veneto region operated at more than 95% efficiency. Technical and scale inefficiencies often occurred jointly, with 77% of inefficient hospitals needing a downsizing strategy to gain efficiency. The inputs identified as needing significant reductions were full-time equivalent (FTE) administrative and technical staff. The size of the hospital in relation to the population served and the length of patient stay were important determinants of the efficiency score. The major cause of decreased efficiency over time was technical change (0.908) rather than efficiency change (0.974).

Conclusions

The study reveals improvements that should be made from both the policy and managerial perspectives. Hospital size is an important feature of inefficiency. On average, the results show that it is advisable for hospitals to reorganize nonmedical staff to enhance efficiency. Further, increasing technology investment could enable higher efficiency levels.

Background

The recent global economic crisis has influenced the budgets of public organizations, including those of public hospitals and national health systems in general. According to the World Health Organization, for high-income and upper-middle-income countries, such as Italy, the main challenge relating to the provision of health services is to continue improving efficiency, quality, and equity [1]. Moreover, within the evolving social and economic environment, budget constraints have prompted a search for new ways of monitoring and controlling organizational finances that focus on the efficient and effective use of public resources [2]. Monitoring the performance of healthcare providers is a relevant issue worldwide, particularly in contexts such as hospitals given their significant effect on population health and the economy. The main problem facing hospitals has been inefficient use of existing resources rather than a lack of resources [3]. Therefore, hospital efficiency plays a strategic role in healthcare organizations. Assessing the determinants of efficiency allows managers to formulate appropriate organizational strategies to meet the challenges associated with continuous change. Focusing on the Italian National Health System, Guerrini et al. [4] note that in recent years, increasing attention has been paid to ensuring financial equilibrium and reducing the average annual growth rate of total health expenditure per capita. Italy has a regionally based National Health System that provides universal coverage free of charge [5]. Regional governments allocate resources to healthcare organizations and have a significant degree of autonomy in organizing, planning, and monitoring the provision of healthcare, as well as in determining the number and vocation of healthcare providers [6] and the size of hospitals in terms of the number of beds, which is one of regional governments' most important choices in relation to hospitals.

Literature review

Over the past 30 years, many research studies on hospital efficiency evaluation have been conducted in different countries. Färe et al. [7] published the first European study, and this was followed by many others [8,9,10,11,12,13]. Such studies have now been conducted in many other countries around the world, including in China, Iran, Brazil, Ukraine, and Angola [14,15,16,17,18].

Hospital efficiency has been evaluated in relation to many factors (see Table 1). Kohl et al. [19] consider capital investment of great importance in efficiency analysis. Chilingerian and Sherman [20] note the importance of labor to the service process in hospitals, identifying this factor as essential in assessing performance. In addition, other organizational choices, such as teaching status and the provision of first aid, are often considered relevant in assessing hospital efficiency in the literature [21]. A frequently investigated aspect of hospital efficiency is patient length of stay. Among previous studies there is consensus that an increase in the number of hospitalization days has a negative effect on hospital performance [22, 23]. Rebba and Rizzi [24] have found that a high number of beds per inhabitant is one of the major causes of hospital inefficiency because it increases overhead costs. In addition, Daidone and D’Amico [25] and Shahhoseini et al. [26] argue that efficiency is positively affected by hospital size. Nevertheless, this relationship remains controversial because research such as that of Nayar et al. [27] has found that small hospitals have higher efficiency and quality scores than do large hospitals. In addition, Chang [28] argues that the number of service types offered is negatively related to efficiency because a greater scope of service means a higher level of management complexity. This is supported by Campedelli et al. [29], who found that hospitals that provide a first-aid service incur higher costs than hospitals that do not provide this service.

Table 1 Factors affecting hospital efficiency

Many studies on hospital efficiency assess pure technical efficiency (PTE) and scale efficiency [30, 31]. PTE represents managerial efficiency [32], which refers to management’s ability to save inputs to produce a certain amount of outputs or to produce more outputs with a given level of inputs [33, 34]. Scale efficiency indicates whether an organization operates at the most productive scale size [35].

However, despite the numerous studies on hospital efficiency conducted in many different countries, few studies have attempted to include quality measures [36], particularly among those undertaken in the European context. One possible reason for this gap could be that the scientific community has not yet agreed on a common standard for addressing questions of quality in hospitals [19]. Yet, according to Chatfield [37], efficiency studies without quality considerations neglect a critical factor. Some research that does consider quality has been conducted in the United States [38, 39], and it identifies mortality rate as a good measure of quality.

The purpose of this study was to assess the efficiency of public hospitals while including measures of quality. To fulfill this aim, the study attempted to answer the following research questions: (1) What are the main organizational factors that generate hospital inefficiency? (2) How do internal and external features affect hospital efficiency? (3) How has hospital efficiency changed over time? To answer these questions, the study employed multistage data envelopment analysis (DEA).

Data envelopment analysis

DEA is a nonparametric technique developed by Charnes et al. [40]. It is used to rank and compare the efficiency of various entities, defined as decision-making units (DMUs). DEA is grounded in an optimization algorithm that assigns a score between 0 and 1 to the DMUs given the input consumed and the output produced. DEA models allow assessment of the relative efficiency of DMUs by creating a production frontier using the best practice of the observed data. In addition, DEA can be considered an alternative to parametric frontier methods and financial ratio analysis. Rhodes [41] highlights that financial ratios allow benchmarking among a multitude of operating units, focusing on their financial results. Nayar et al. [27] note that the main flaw in measuring performance using this kind of methodology is the lack of technical indicators that enable evaluation of the efficiency of structures and the quality of services provided. According to Worthington [42] and O’Neill et al. [43], methods of nonparametric analysis such as DEA overcome the weaknesses of financial ratios and parametric analysis because they do not require any assumption related to the functional form of the relationship between outputs and inputs [44]. Further, DEA can not only identify inefficient units but can also assess the degree of inefficiency. DEA uses linear programming to construct a piecewise convex linear-segmented efficiency frontier, making it more flexible than econometric frontier analysis. Moreover, DEA can include multiple inputs and outputs. Despite these identified benefits, DEA presents a drawback: it attributes every deviation from the best practice frontier to inefficiency. However, such deviations might be due to statistical noise (e.g., measurement errors).

Methods

Study design

The study adopted a cross-sectional design to assess the efficiency of public hospitals in the Veneto region for the years 2018 and 2019. It used a longitudinal design to analyze the trend in technical efficiency in general hospitals from 2018 to 2019. More specifically, a Charnes, Cooper, and Rhodes (CCR) input-oriented model, decomposition of the obtained scores, and slack assessment were developed for 2018 and 2019. Subsequently, Tobit regression was applied to understand the internal and external sources of inefficiency. Finally, the Malmquist Productivity Index was used to assess how efficiency has changed over time.

Study population

To conduct the efficiency analysis, the Veneto region was selected as the case study site [45]. This was because of the region's high level of interest in researching new ways to control the efficiency of public hospitals. Guerrini et al. [3] found that this region has specific characteristics (i.e., an increase in the elderly population and growing life expectancy of residents) that have led to a gradual increase in comorbidity and chronic diseases and a corresponding increase in demand for high-quality healthcare services. All the data used in this study were provided by Azienda Zero UOC Controllo di Gestione e Adempimenti LEA (Azienda Zero). A full dataset from 2018 to 2019 containing nonpublicly available technical data, cost items, and revenue items was analyzed. The data included in this study are the operating costs and revenues of public hospitals, the number of beds, the number of FTEs, mortality rate, inappropriate admission rate, bed occupancy rate, length of stay, provision of first aid, and number of residents. During the period under analysis, the Veneto region had 53 public hospitals. One hospital closed in 2018, three hospitals treated only a particular type of patient and were therefore considered nonhomogeneous and noncomparable, and six hospitals presented missing data. Therefore, the final sample included 43 hospitals and 86 observations.

Selection of study variables

Selecting suitable inputs and outputs is crucial for ensuring a meaningful efficiency analysis. To preserve the discriminative power of DEA, Dyson et al. [46] recommend being parsimonious with the number of inputs and outputs. According to these researchers, the number of DMUs should always be larger than 2 × (number of inputs + number of outputs) to ensure sufficient discrimination between units. In this research, six inputs and seven outputs were chosen, giving a threshold of 2 × (6 + 7) = 26, which is well below the 43 DMUs in the sample. Therefore, this rule was not violated.

Following Ozcan [47], we used three categories of inputs: capital investment, labor, and operating expenses. As suggested by Kohl et al. [19], a good proxy for capital investment is the number of beds (Beds). This variable is widely used in the literature [48,49,50,51]. As a proxy for labor, we used the number of hospital FTEs. Chilingerian and Sherman [20] note the relevance of labor to hospital efficiency. In addition, these researchers advise distinguishing between different types of personnel. Thus, we identified four categories of personnel: medical (FTE Med), nursing (FTE Nurse), administrative (FTE Admin), and technical (FTE Tech). As a proxy for operating expenses, we used the operating costs (Cost) of hospitals, which is a commonly used variable in the literature [52,53,54].

For the outputs, we used five variables to evaluate hospital efficiency and two variables to evaluate hospital quality. For efficiency-output variables for hospitals, Ozcan [47] advises including inpatient and outpatient visits. Thus, we used the number of outpatient visits (Outpatients) and the number of case-mix-adjusted inpatients (Adj. Inpatients). A commonly used output is operating revenue [55,56,57]. To measure this output, we used revenues from inpatient visits (Inpatient Revenue) and revenues from outpatient visits (Outpatient Revenue). Another variable commonly used to analyze hospital efficiency [58, 59] is bed occupancy rate (BOR), which we also included as an efficiency-output measure. We used two measures to evaluate hospital quality as an output. The first quality measure was mortality rate, which is often used in hospital efficiency analyses that incorporate quality [39, 60, 61]. The second quality measure is a new variable that has not been used in previous literature: the potential inappropriate admission rate, which is the ratio between admissions attributed to diagnosis-related groups at a high risk of inappropriateness and admissions attributed to no risk of inappropriateness. This measure represents the hospital’s ability to classify patients and receive adequate remuneration for hospital activities. These two quality variables are considered undesirable outputs. Incorporating undesirable outputs in DEA research has been discussed in previous studies. One idea [62] is to apply inversion transformation to the value of undesirable factors. However, the transformed values are usually negative, making it difficult for DEA modeling. To overcome this difficulty, one solution is to add a constant to the transformed undesirable factors to keep them positive [63]. Unfortunately, the determination of this constant has a significant effect on DMU ranking and classification. Färe and Grosskopf [64] propose an alternative approach of using a directional distance function. However, the form of the direction vector could affect the DMU ranking. Thus, in some cases, an improper transformation may generate a result that is inverse from what is expected. According to Guo et al. [65], a widely used form of transformation is reciprocal transformation, which this study adopted based on previous literature [66,67,68,69].
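To make the treatment of the two undesirable quality outputs concrete, the following is a minimal sketch of the reciprocal transformation (hypothetical rates, not the study's data), which turns "lower is better" rates into quantities that can enter the DEA model as ordinary outputs:

```python
# Minimal sketch with hypothetical rates: reciprocal transformation of the
# two undesirable quality outputs so that larger transformed values are better.
import numpy as np

mortality_rate = np.array([0.04, 0.05, 0.03])        # undesirable: lower is better
inappropriate_rate = np.array([0.18, 0.16, 0.20])    # undesirable: lower is better

# Reciprocal transformation u -> 1/u; higher transformed values now indicate
# better quality, so the variables can be treated as standard DEA outputs.
mortality_output = 1.0 / mortality_rate
inappropriate_output = 1.0 / inappropriate_rate
print(mortality_output, inappropriate_output)
```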

To examine the factors that affect the productivity of public hospitals in the Veneto region, we considered institutional factors (i.e., factors that can be controlled by hospital management) and contextual factors (i.e., factors that are beyond the control of hospital management) to estimate their effects on performance. Based on previous literature, we identified five regressors. The first regressor was beds per resident (Beds_Residents), which represents the size of the hospital in relation to the served population [3]. The second regressor was average length of stay (ALOS) [70, 71]. The third regressor was teaching status (Teach), a dummy variable that represents whether a hospital is attached to a university [72]. The fourth regressor was type of hospital (Type) [21], a dummy variable that indicates whether a structure is classified as a HUB hospital. The fifth regressor was first aid (FIRST-AID) [3], a dummy variable that indicates whether or not the hospital provides first aid.

Model specification

The two basic DEA models are the CCR [40] and the Banker, Charnes, and Cooper (BCC) [73] models. According to Kohl et al. [19], despite countless extensions, almost 80% of studies conducted in the hospital sector apply one of these two basic models. The discussion about which of these two models is superior (and therefore the assumption of constant return to scale [CRS] or variable return to scale [VRS]) has been ongoing since their invention. It is unlikely there will ever be a universally valid answer to that question. However, Banker et al. [74] demonstrate that the CCR model yields better results for small sample sizes of up to 50 DMUs, while the BCC model performs better with larger sample sizes (at least 100 DMUs). Thus, this study used the CCR model. In relation to the choice of model orientation, Alatawi et al. [48] note that the most common orientations employed in DEA studies are input orientation (i.e., minimizing inputs for a given amount of outputs) and output orientation (i.e., holding inputs constant and proportionally increasing outputs). This study employed input orientation (CCR-I), with the choice being guided by previous literature [43] and because generally, hospitals have more control over their inputs than they do over their outputs. The mathematical formulation is presented below:

$$min\theta -\varepsilon \left(\sum_{i=1}^m{s}_i^{-}+\sum_{r=1}^s{s}_r^{+}\right)$$

subject to:

$$\sum_{j=1}^n{x}_{ij}{\lambda}_j+{s}_i^{-}=\theta {x}_{i0},\quad i=1,2,\dots, m;$$
$$\sum_{j=1}^n{y}_{rj}{\lambda}_j-{s}_r^{+}={y}_{r0},\quad r=1,2,\dots, s;$$
$${\lambda}_j\ge 0,\quad j=1,2,\dots, n;\quad {s}_i^{-}\ge 0,\ {s}_r^{+}\ge 0$$
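For readers who prefer code, the following is a minimal sketch (not the study's implementation, which used MaxDEA) of the input-oriented CRS envelopment model above, solved DMU by DMU with scipy; the slack-maximization phase and the non-Archimedean ε are omitted for brevity, and the toy data are hypothetical:

```python
# Minimal sketch of the input-oriented CCR (CRS) DEA model, one LP per DMU.
import numpy as np
from scipy.optimize import linprog

def ccr_input_oriented(X, Y):
    """X: inputs (m x n), Y: outputs (s x n); columns are DMUs. Returns CCR-I scores."""
    m, n = X.shape
    s_out, _ = Y.shape
    scores = np.empty(n)
    for j0 in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]               # minimize theta
        # Inputs: sum_j x_ij*lambda_j <= theta * x_i,j0  ->  -x_i,j0*theta + X@lambda <= 0
        A_in = np.c_[-X[:, [j0]], X]
        b_in = np.zeros(m)
        # Outputs: sum_j y_rj*lambda_j >= y_r,j0  ->  -Y@lambda <= -y_r,j0
        A_out = np.c_[np.zeros((s_out, 1)), -Y]
        b_out = -Y[:, j0]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores[j0] = res.x[0]
    return scores

# Toy data (hypothetical): 2 inputs and 1 output for 4 hospitals
X = np.array([[100.0, 120.0, 90.0, 150.0],        # e.g., beds
              [300.0, 280.0, 250.0, 400.0]])      # e.g., FTEs
Y = np.array([[5000.0, 5200.0, 4800.0, 5100.0]])  # e.g., case-mix-adjusted inpatients
print(np.round(ccr_input_oriented(X, Y), 3))
```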

Data analysis

A Microsoft Excel database was constructed for all the input, output, and predictor variables using the data provided by Azienda Zero. The Excel data were exported to Gretl software to generate the descriptive statistics. The data were then imported into MaxDEA Basic software to run the CCR-I model, decompose the scores, and assess the slacks.

As stated, when running basic DEA models, it is necessary to choose between CRS and VRS models and to select the orientation (input or output) of these models. To determine scale efficiency (SE), CRS and VRS models are often run simultaneously. The technical efficiency (TE) calculated by CRS models is the overall technical efficiency (OTE), whereas the VRS specification allows evaluation of PTE, purifying the score of SE effects [73]. From the SE perspective, the sum of the lambdas is used to appraise the return to scale (RTS): if a DMU's sum of lambdas equals 1, it operates under CRS; if the sum is less than 1, the DMU has increasing return to scale (IRS); and if the sum is greater than 1, the DMU has decreasing return to scale (DRS). Under CRS, the DMU operates at the most productive scale size. Under DRS, the DMU should reduce its input level, which is referred to as a “downsizing decision.” IRS is the opposite situation, in which the DMU should make an “upsizing decision.”

According to Yildirim et al. [75], it is possible to decompose the DEA OTE score into its sources of possible inefficiency. This is achieved using the following formula:

$$\mathrm{OTE}=\mathrm{PTE}\times \mathrm{SE}$$

CRS (i.e., CCR) and VRS (i.e., BCC) models were both estimated in this study in consideration of the RTS conditions. CRS models assume that outputs change in the same proportion as inputs, while VRS models allow a proportional increase in inputs to produce a proportionally larger or smaller increase in outputs. If the CRS and VRS efficiency scores are not equal, this indicates scale inefficiency. SE expresses how close the DMU is to the optimal scale size: the larger the scale efficiency, the closer the DMU is to the optimal scale size [76].
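As a small sketch of how the decomposition and the RTS rules described above are applied once the CRS and VRS scores and the CRS lambdas are available (hypothetical scores, not the study's results):

```python
# Sketch with hypothetical scores: decompose OTE into PTE and SE and classify
# returns to scale from the sum of lambdas of the CRS (CCR) solution.
import numpy as np

ote = np.array([1.00, 0.92, 0.85])          # CRS (CCR) scores
pte = np.array([1.00, 0.97, 0.95])          # VRS (BCC) scores
lambda_sum = np.array([1.00, 1.35, 0.80])   # sum of lambdas from the CRS model

se = ote / pte                               # OTE = PTE x SE  =>  SE = OTE / PTE
rts = np.where(np.isclose(lambda_sum, 1.0), "CRS (optimal scale)",
               np.where(lambda_sum > 1.0, "DRS (downsize)", "IRS (upsize)"))
print(np.round(se, 3), rts, sep="\n")
```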

Once the performance of a hospital has been captured by the DEA score, it is useful to attempt to identify some of the factors that could affect efficiency. Kirigia and Asbu [70] identify different regression techniques that have been applied to estimate the effect of contextual factors on efficiency, including ordinary least squares and the maximum-likelihood-based probit, logit, and censored regression (e.g., Tobit regression). Tobit regression has been widely used in previous literature [3, 70, 77] and was chosen for this study because it can describe the relationship between a nonnegative dependent variable and independent variables. Following Sultan and Crispim [78], the CCR efficiency scores in this study were transformed into inefficiency scores and left-censored at 0 using the following formula:

$$Inefficiency\ score=\left(\frac{1}{CCR\ score}\right)-1$$

Simar and Wilson [79] note that this approach is not without criticism: DEA scores are correlated with one another because the score of each DMU depends on the observations of all other DMUs, so the second-stage regression violates the standard independence assumption. However, according to Banker and Natarajan [80], a two-stage DEA approach with Tobit or ordinary least squares regression outperforms one-stage parametric methods (e.g., Cobb–Douglas, Translog) in defining the production frontier.

Accordingly, the estimated Tobit model for the study was specified as follows:

$$\mathrm{INEFFICIENCY}={\upbeta}_0+{\upbeta}_1\mathrm{BedsPop}+{\upbeta}_2\mathrm{ALOS}+{\upbeta}_3\mathrm{Teach}+{\upbeta}_4\mathrm{Type}+{\upbeta}_5\mathrm{FirstAid}+\upvarepsilon,$$

where β is the vector of unknown coefficients and ε is the stochastic/random error term. Using Gretl software, we tested the hypothesis that βn is not significantly different from 0 in either direction. Thus, the null (H0) and alternative hypotheses (HA) are as follows: H0: βn = 0 and HA: βn ≠ 0.
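The following is a hedged sketch of this second stage: CCR scores are converted into left-censored inefficiency scores and a Tobit model is fitted by maximum likelihood. The data are placeholders, only two of the five regressors are included for brevity, and the hand-rolled likelihood merely stands in for the Gretl routine used in the study:

```python
# Sketch with placeholder data: inefficiency transformation and a Tobit model
# left-censored at 0, estimated by maximum likelihood with scipy.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(params, X, y):
    """Negative log-likelihood of a Tobit model left-censored at 0."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    obs = y > 0                                   # uncensored observations
    ll = np.sum(norm.logpdf((y[obs] - xb[obs]) / sigma) - np.log(sigma))
    ll += np.sum(norm.logcdf(-xb[~obs] / sigma))  # observations censored at 0
    return -ll

def fit_tobit(X, y):
    X = np.column_stack([np.ones(len(y)), X])     # add intercept
    start = np.r_[np.linalg.lstsq(X, y, rcond=None)[0], 0.0]
    res = minimize(tobit_negloglik, start, args=(X, y), method="Nelder-Mead")
    return res.x[:-1], np.exp(res.x[-1])          # coefficients, sigma

# Inefficiency score = 1 / CCR score - 1 (0 for efficient hospitals)
ccr = np.array([1.00, 0.95, 0.88, 1.00, 0.92, 0.85, 1.00, 0.90])
inefficiency = 1.0 / ccr - 1.0
# Toy regressors: beds per resident and ALOS only (the full model also
# includes the teaching, HUB-type, and first-aid dummies)
X = np.array([[0.002, 7.9], [0.003, 9.4], [0.004, 10.2], [0.002, 7.8],
              [0.003, 8.9], [0.004, 9.8], [0.002, 8.0], [0.003, 9.1]])
betas, sigma = fit_tobit(X, inefficiency)
print(np.round(betas, 4), round(sigma, 4))
```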

A drawback of basic DEA models is that scores cannot be directly assessed over time. To overcome this issue, following Mujasi and Kirigia [81], we used DEAP software to compute an input-oriented Malmquist Productivity Index and examine efficiency and productivity changes over the study period (2018–2019). According to Coelli et al. [82], the Malmquist Productivity Index allows identification of the contributions that innovation (technical change) and diffusion and learning (catching up or efficiency change) make to productivity growth. This index attains a value greater than, equal to, or less than 1 if a hospital has experienced productivity growth, stagnation, or productivity decline, respectively. The Malmquist Productivity Index can be broken down into various sources of productivity change, referred to as “technical change” and “efficiency change.” Technical change (TECHCH) is the measure of change in hospital production technology; it measures the shift in technology use over years. Efficiency change (EFFCH) represents change in the gap between observed production and the production frontier between two different years. When estimated under the assumption of CRS, EFFCH can be further broken down into pure efficiency change (PECH) and scale efficiency change (SECH). SECH refers to productivity change resulting from scale change that brings the hospital closer to or further away from the optimal scale of inputs as identified by VRS technology. PECH measures change in TE under the assumption of a VRS technology [81].
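For reference, the components named above are related through the conventional Färe et al. decomposition (standard formulation, not reproduced from the article), where $D_0^t$ denotes the distance function relative to the period-t technology:

$$\mathrm{TFPCH}=\mathrm{EFFCH}\times \mathrm{TECHCH},\qquad \mathrm{EFFCH}=\mathrm{PECH}\times \mathrm{SECH}$$

$$M_0\left({x}^{t+1},{y}^{t+1},{x}^t,{y}^t\right)=\frac{D_0^{t+1}\left({x}^{t+1},{y}^{t+1}\right)}{D_0^t\left({x}^t,{y}^t\right)}{\left[\frac{D_0^t\left({x}^{t+1},{y}^{t+1}\right)}{D_0^{t+1}\left({x}^{t+1},{y}^{t+1}\right)}\cdot \frac{D_0^t\left({x}^t,{y}^t\right)}{D_0^{t+1}\left({x}^t,{y}^t\right)}\right]}^{1/2}$$

The first ratio is EFFCH (catching up) and the bracketed geometric mean is TECHCH (frontier shift).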

Results

Descriptive statistics

Descriptive statistics for input and output variables for the years 2018 and 2019 are presented in Table 2.

Table 2 Descriptive statistics for input and output variables

In 2018, the 43 public hospitals under analysis used a total of 13,982 beds, €4,601,644,861.97 in operating costs, and 43,200.81 FTEs (medical, nursing, administrative, and technical) to generate 59,478,432 outpatient visits and €2,983,535,271.00 in revenue. In addition, the average number of adjusted inpatients and the average BOR were 13,928.37 and 74%, respectively, while the mean inappropriate admission and mortality rates were 18% and 4%, respectively. In 2019, the number of beds and the costs decreased to 13,819 and €4,588,391,031.74, respectively, while the number of FTEs increased to 43,205. For the outputs, total revenues increased to €3,035,460,338.27 and total outpatient visits to 63,033,348. On average, the number of adjusted inpatients, the BOR, and the mortality rate increased to 14,005.53, 75%, and 5%, respectively, while the inappropriate admission rate decreased to 16%.

Overall technical efficiency scores

The results of the first stage of analysis are presented in Table 3.

Table 3 Summary of CCR-I scores (2018–19)

In 2018, 28 DMUs were considered efficient, which means that more than 65% of hospitals did not need any input reduction given their level of quantitative and qualitative outputs. In 2019, 23 DMUs (53%) were considered efficient. The average CCR-I score was 0.9756 in the first year of analysis (2018), decreasing to 0.9533 in the second year (2019).

Pure technical and scale efficiency

The hospitals’ TE, PTE, and SE scores are presented in Table 4.

Table 4 OTE, PTE, and SE scores (2018 and 2019)

In this stage of the analysis, overall efficiency was decomposed into pure technical efficiency and scale efficiency. Among the 15 inefficient hospitals identified in 2018, six had only scale inefficiencies, while nine had both managerial and scale inefficiencies. The average TE and SE scores for 2018 were 0.984 and 0.991, respectively, and TE scores ranged from 0.810 to 1. For the OTE scores, inefficient hospitals had a minimum score of 0.7692 and a maximum of 0.998, which means that they could have decreased input usage by 0.2 to 23.08% without changing their qualitative and quantitative output levels. In 2019, 20 hospitals were inefficient: seven had both managerial and scale inefficiencies, while 13 had only scale inefficiencies. The average TE and SE scores were 0.966 and 0.986, respectively, and the minimum TE score was 0.795. Therefore, the results highlight that in both years (2018 and 2019), in most cases, both input utilization and the scale of the hospitals were causes of hospital inefficiency. The results also show an average PTE greater than the OTE. In relation to RTS, Table 2 reveals that among the total of 35 inefficient hospitals, 27 presented DRS, which means that 77% of inefficient hospitals needed a downsizing strategy to gain efficiency. The other eight hospitals presented IRS, indicating that 23% of inefficient hospitals needed to increase their size to gain efficiency. That is, we found a greater prevalence of DRS than IRS among the inefficient hospitals.

Input and output slack assessment of inefficient hospitals

Inefficiency is caused by noneffective use of inputs and/or outputs [83]. Accordingly, input and output slacks are factors that contribute to inefficiencies. It is important for managers to determine the target values, to measure against targets, and to control their improvement progress. Good operations management decreases output shortage levels (the quantities of output slacks) by employing surplus resources (the quantities of input slacks added to the proportional reduction of inputs [50]). Hence, the evaluation of slack variables is crucial for efficiency improvement. As stated, one of the variables presented (i.e., the number of beds) is not controllable by management because this number is chosen by regional governments in Italy. The input and output slacks of inefficient hospitals are presented in Table 5.

Table 5 Input and output slacks of inefficient hospitals

In this table, the minus sign indicates that a further reduction is necessary to obtain an improvement for the variable; the plus sign indicates that a further increment of the variable is required. The results show that beyond the proportional movement, the inputs that were controllable by management and that needed a particularly significant reduction were, on average, FTE administrative staff and technicians. The output that required the biggest increase was the number of outpatient visits. The analysis reveals that a significant reduction of the quality output variables was not needed.
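For clarity, the targets implied by these slacks follow directly from the optimal CCR-I solution (standard target formulas, not specific to this article): each input target combines the proportional contraction θ* with any remaining input slack, and each output target adds the output slack to the observed level:

$${x}_{i0}^{\mathrm{target}}={\theta}^{\ast }{x}_{i0}-{s}_i^{-\ast },\qquad {y}_{r0}^{\mathrm{target}}={y}_{r0}+{s}_r^{+\ast }$$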

Internal and external sources of inefficiency

The selected Tobit model for explaining the observed hospital inefficiencies in stage four of the analysis was presented in the Data Analysis section. Table 6 presents the results of the Tobit regression model.

Table 6 Results of Tobit model

The results reveal that the coefficient for Beds_Residents (i.e., the size per population served) had a positive sign, statistically significant at the 5% level. This means that the higher a hospital’s Beds_Residents, the higher the predicted inefficiency score. The relationship between size and efficiency remains controversial in the literature. In this study, Tobit regression analysis indicated that size negatively affected the efficiency scores. Average length of stay (ALOS) was the second significant variable (p < 0.05) and had a positive sign, which means that it was positively correlated with inefficiency. The other three regressors (i.e., type of hospital, teaching status, and first-aid provision) were not statistically significant at the 5% level.

Efficiency evaluation over time

Table 7 presents the Malmquist Productivity Index summary of annual geometric means.

Table 7 Malmquist Productivity Index results

Analyzing the total factor productivity change (TFPCH) (i.e., the Malmquist Productivity Index) in the last column of Table 7 (column 6) reveals that, on average, TFPCH decreased by 11.5% over the 2018–2019 period. The minimum score was 0.125 and the maximum score was 2.417. Nineteen hospitals (44.2%) had TFPCH scores greater than 1, indicating growth in productivity. The productivity growth in 13 of these hospitals was attributed to technological innovation only (with EFFCH equal to or less than 1), and one hospital performed better because of efficiency improvements only. In contrast, 24 hospitals (55.8%) had a TFPCH of less than 1, indicating a decrease in productivity. The productivity decrease in three of these hospitals was due to a decline in efficiency only. In nine hospitals, the productivity decrease was attributed to a decrease in both efficiency and innovation, while in 12 hospitals, it was attributed to a decrease in technological innovation only. PECH, which refers to management efficiency change, and SECH, which refers to scale efficiency change, are the two components of EFFCH. PECH had an average score of 0.994 and SECH had an average score of 0.98. Seven hospitals had a PECH greater than 1, 29 equal to 1, and seven less than 1. The SECH scores indicate that seven hospitals increased their scale efficiency, 23 kept their scale efficiency constant, and 13 decreased it. Overall, the PECH and SECH results reveal that the major cause of the decrease was TECHCH (0.908) rather than EFFCH (0.974).

Discussion

Analyzing hospital efficiency is a highly important area of research in healthcare management [84] because it enables identification of inefficiency sources and supports corrective action so that scarce resources are not wasted. Using an input-oriented CCR model in which even the quality variables (i.e., mortality rate and inappropriate admission rate) were held constant, the analysis reveals that on average, the efficiency of hospitals in the Veneto region was high and was even higher than has been found in previous literature [3].

For most of the inefficient hospitals identified in the study, the results highlight that for the two years under investigation, both input utilization and hospital scale caused inefficiency. The finding of an average PTE greater than the OTE is consistent with previous literature [75, 84].

In relation to the inefficient hospitals, the study highlights the importance of making improvements in both policymaking and managerial capacities. According to the scale efficiency analysis, hospital size (i.e., the number of beds) is one of the most important determinants of inefficiency. In relation to RTS, the results are consistent with those of Kirigia and Asbu [70] and Tlotlego et al. [85], showing a prevalence of DRS over IRS among inefficient hospitals, meaning that most inefficient hospitals needed a downsizing strategy to gain efficiency.

In addition, the findings of the slack assessment suggest that the inefficient hospitals needed to reorganize FTEs, particularly improving the efficiency of administrative and technical staff.

The results for the analysis of the external and internal factors related to efficiency show partial agreement with previous studies. Specifically, in line with Rebba and Rizzi [24] and Nayar et al. [27], the Tobit regression analysis in our study indicates that hospital size negatively affects efficiency scores. However, these results differ from the findings of Daidone and D’Amico [25]. Average length of stay (ALOS) was also a significant variable (p < 0.05). It had a positive sign, which means that it was positively related to inefficiency, and this correlation has been identified in several other studies [22, 23].

Unlike Kakeman et al. [21], we found that the type of hospital was a nonsignificant driver of efficiency. We also found that teaching status was a nonsignificant factor of efficiency, which aligns with the findings of Kakeman et al. [21] but contrasts with those of Sarabi Asiabar et al. [86] and Ali et al. [87]. In addition, the nonsignificance of the finding for the presence of first-aid provision differs greatly from previous literature [28, 29].

In assessing efficiency over time, we found that the major cause of the decrease in productivity was technical change rather than efficiency change. The finding that average technical change was lower than average efficiency change is in line with the findings of Li et al. [88] and Tlotlego et al. [85]. Technological progress (or decline) depends on different factors, including the availability of appropriate healthcare technology, access to new technologies at affordable prices, the availability of training opportunities that enable the workforce to acquire new skills and take full advantage of new technological possibilities, and the availability of funds to finance necessary healthcare technology investments [89].

Limitations and further research

This study has some limitations. Beyond the typical limitations of basic DEA models [19], some issues are specific to this study. One is that we incorporated only a limited number of quality measures into the model, excluding measures such as hospital readmission rates because of the unavailability of data. In addition, an examination of demographic factors, such as the age of the population served, could reveal significant information for the hospitals we analyzed. Finally, further research could extend this study by assessing hospital efficiency in other Italian regions.

Conclusions

This study contributes to the healthcare management literature because few previous studies in the European context have analyzed hospital efficiency using quality measures as outputs [36]. To fill this gap, this study used hospitals in the Veneto region as a case study, identifying the organizational causes that affected efficiency, also considering quality measures, and how these changed over a period of two years. The results reveal that more than half of the hospitals under review were efficient. Many of the hospitals that were found to be inefficient had both input utilization and scale inefficiency. This study also provides empirical evidence for the main causes of inefficiency: hospital size is one of the most important sources of inefficiency, and administrative and technical staffing is another frequent source. The contextual factors that most influence efficiency are the average patient length of stay and hospital size with respect to the population served. Finally, the role of technology is crucial to maintaining or increasing efficiency levels over time.

DEA is a good method for measuring hospital performance, in addition to the budgeting process that has traditionally been used in Italy. DEA is particularly useful for its ability to estimate the volume of inputs and outputs that can be optimized and its capacity to identify the main sources of inefficiency.

The results also underscore that a rethink of hospital size on the part of policymakers would be particularly valuable for increasing the current efficiency levels of many hospitals and maintaining constant and high service quality. Improving performance also depends on improving staff efficiency, particularly administrative and technical staff. This may be achieved by providing capacity building such as training staff members on efficient resource utilization. Finally, the results suggest that hospital managers should pay significant attention to advancements in technology and the skills needed to employ new technology in the best possible way.

Availability of data and materials

The data that support the findings of this study are available from Azienda Zero—Veneto Region, but restrictions apply to the availability of these data, which were used under license for the current study, and so are not publicly available. However, data are available from the authors upon reasonable request and with the permission of Azienda Zero—Veneto Region.

Abbreviations

ALOS: Average length of stay
BCC: Banker, Charnes, and Cooper
BOR: Bed occupancy rate
CCR: Charnes, Cooper, and Rhodes
CRS: Constant return to scale
DEA: Data envelopment analysis
DMU: Decision-making unit
DRS: Decreasing return to scale
EFFCH: Efficiency change
FTE: Full-time equivalent
IRS: Increasing return to scale
OTE: Overall technical efficiency
PECH: Pure efficiency change
PTE: Pure technical efficiency
RTS: Return to scale
SE: Scale efficiency
SECH: Scale efficiency change
TE: Technical efficiency
TECHCH: Technical change
TFPCH: Total factor productivity change
VRS: Variable return to scale

References

  1. World Health Organization. Primary health care on the road to universal health coverage: 2019 monitoring report. Geneva: WHO; 2019.

  2. Rondeau KV, Wagar TH. Downsizing and organizational restructuring: what is the impact on hospital performance? Int J Public Adm. 2003;26(14):1647–68.

  3. Sahin I, Ozcan YA. Public sector hospital efficiency for provincial markets in Turkey. J Med Syst. 2000;24(6):307–20.

  4. Guerrini A, Romano G, Campedelli B, Leardini C. Public vs. private in hospital efficiency: exploring determinants in a competitive environment. Int. J Public Adm. 2018;41(3):181–9.

  5. Lega F, Sartirana M. Making doctors manage … but how? Recent developments in the Italian NHS. BMC Health Serv Res. 2016;16(2):65–72.

  6. Ferré F, de Belvis AG, Valerio L, Longhi S, Lazzari A, Fattore G. Italy: health system review. Health Syst Transit. 2014;16(4):1–168.

  7. Färe R, Grosskopf S, Lindgren B, Roos P. Productivity developments in Swedish hospitals: a Malmquist output index approach. In: Charnes A, Cooper WW, Lewin AY, Seiford LM, editors. Data envelopment analysis: theory, methodology, and applications. New York: Springer; 1994. p. 253–72.

  8. Kontodimopoulos N, Niakas D. Efficiency measurement of hemodialysis units in Greece with data envelopment analysis. Health Policy. 2005;71(2):195–204.

  9. Dotoli M, Epicoco N, Falagario M, Sciancalepore F. A cross-efficiency fuzzy data envelopment analysis technique for performance evaluation of decision making units under uncertainty. Comput Industrial Eng. 2015;79:103–14.

  10. Flokou A, Aletras V, Niakas D. Decomposition of potential efficiency gains from hospital mergers in Greece. Health Care Manag Sci. 2017;20(4):467–84.

  11. Herwartz H, Strumann C. On the effect of prospective payment on local hospital competition in Germany. Health Care Manag Sci. 2012;15(1):48–62.

  12. van Ineveld M, van Oostrum J, Vermeulen R, Steenhoek A, van de Klundert J. Productivity and quality of Dutch hospitals during system reform. Health Care Manag Sci. 2016;19(3):279–90.

  13. Siciliani L. Estimating technical efficiency in the hospital sector with panel data. Appl Health Econ Health Policy. 2006;5(2):99–116.

  14. Liu X, Mills A. The effect of performance-related pay of hospital doctors on hospital behaviour: a case study from Shandong. China Hum Resour Health. 2005;3(11):1–12.

  15. Marnani AB, Sadeghifar J, Pourmohammadi K, Mostafaie D, Abolhalaj M, Bastani P. Performance assessment indicators: how DEA and Pabon lasso describe Iranian hospitals’ performance. HealthMED. 2012;13(3):791–6.

  16. Araújo C, Barros CP, Wanke P. Efficiency determinants and capacity issues in Brazilian for-profit hospitals. Health Care Manag Sci. 2014;17(2):1–13.

  17. Pilyavsky AI, Aaronson WE, Bernet PM, Rosko MD, Valdmanis VG, Golubchikov MV. East–west: does it make a difference to hospital efficiencies in Ukraine? Health Econ. 2006;15(11):1173–86.

  18. Kirigia J, Emrouznejad A, Cassoma B, Asbu EZ, Barry S. A performance assessment method for hospitals: the case of municipal hospitals in Angola. J Med Syst. 2008;32(6):509–19.

  19. Kohl S, Schoenfelder J, Fügener A, Brunner JO. The use of data envelopment analysis (DEA) in healthcare with a focus on hospitals. Health Care Manag Sci. 2019;22(2):245–86.

  20. Chilingerian JH, Sherman HD. Health-care applications: from hospitals to physicians, from productive efficiency to quality frontiers. In: Cooper WW, Seiford LM, Zhu J, editors. Handbook on data envelopment analysis. New York: Springer; 2011. p. 445–93.

  21. Kakeman E, Forushani AR, Dargahi H. Technical efficiency of hospitals in Tehran. Iran Iranian J Public Health. 2016;45(4):494–502.

  22. Staat M. Efficiency of hospitals in Germany: a DEA–bootstrap approach. Appl Econ. 2006;38(19):2255–63.

  23. Dimas G, Goula A, Soulis S. Productive performance and its components in Greek public hospitals. Oper Res. 2012;12(1):15–27.

  24. Rebba V, Rizzi D. Analisi dell’efficienza relativa delle strutture di ricovero con il metodo DEA: il caso degli ospedali del Veneto. Venice: Universita degli Studi di Venezia; 2000.

  25. Daidone S, D’Amico F. Technical efficiency, specialization and ownership form: evidences from a pooling of Italian hospitals. J Productivity Anal. 2009;32(3):203–16.

  26. Shahhoseini R, Tofighi S, Jaafaripooyan E, Safiaryan R. Efficiency measurement in developing countries: application of data envelopment analysis for Iranian hospitals. Health Serv Manag Res. 2011;24(2):75–80.

  27. Nayar P, Ozcan YA, Yu F, Nguyen AT. Benchmarking urban acute care hospitals: efficiency and quality perspectives. Health Care Manag Rev. 2013;38(2):137–45.

  28. Chang F-K. Structural health monitoring: current status and perspectives. Stanford: CRC Press; 1998.

  29. Campedelli B, Guerrini A, Romano G, Leardini C. La performance della rete ospedaliera pubblica della regione Veneto. L’impatto delle variabili ambientali e operative sull’efficienza, vol. 92. Mecosan: Management ed Economia Sanitaria; 2014. p. 119–31.

  30. Dash U, Vaishnavi SD, Muraleedharan VR. Technical efficiency and scale efficiency of district hospitals: a case study. J Health Manag. 2010;12(3):231–48.

  31. Ram Jat T, San SM. Technical efficiency of public district hospitals in Madhya Pradesh, India: a data envelopment analysis. Glob Health Action. 2013;6(1):21742.

  32. Yusefzadeh H, Ghaderi H, Bagherzade R, Barouni M. The efficiency and budgeting of public hospitals: case study of Iran. Iranian Red Crescent Med J. 2013;15(5):393–9.

  33. Kumar S, Gulati R. An examination of technical, pure technical, and scale efficiencies in Indian public sector banks using data envelopment analysis. Eurasian J Bus Econ. 2008;1(2):33–69.

  34. Taib CA, Ashraf MS, Razimi MSA. Technical, pure technical and scale efficiency: a non-parametric approach of Pakistan’s insurance and takaful industry. Acad Account Financ Stud J. 2018;22(1):1–11.

  35. Farrell MJ. The measurement of productive efficiency. J Royal Stat Soc. 1957;120(3):253–90.

  36. Nayar P, Ozcan Y. Data envelopment analysis comparison of hospital efficiency and quality. J Med Syst. 2008;32(3):193–9.

  37. Chatfield JS. Data envelopment analysis comparison of hospital efficiency, quality and control. Int J Manag Account Res. 2014;4(1):93–109.

  38. Valdmanis VG, Rosko MD, Mutter RL. Hospital quality, efficiency, and input slack differentials. Health Serv Res. 2008;43(5p2):1830–48.

  39. Clement JP, Valdmanis VG, Bazzoli GJ, Zhao M, Chukmaitov A. Is more better? An analysis of hospital outcomes and efficiency with a DEA model of output congestion. Health Care Manag Sci. 2008;11(1):67–77.

  40. Charnes A, Cooper WW, Rhodes E. Measuring the efficiency of decision making units. Eur J Oper Res. 1978;2(6):429–44.

  41. Rhodes EL. Using data envelopment analysis (DEA) to evaluate environmental quality and justice: a different way of looking at the same old numbers. Int J Public Adm. 2002;25(2–3):253–79.

  42. Worthington AC. Frontier efficiency measurement in health care: a review of empirical techniques and selected applications. Med Care Res Rev. 2004;61(2):135–70.

  43. O’Neill L, Rauner M, Heidenberger K, Kraus M. A cross-national comparison and taxonomy of DEA-based hospital efficiency studies. Socio Econ Plan Sci. 2008;42(3):158–89.

  44. Zhang X, Tone K, Lu Y. Impact of the local public hospital reform on the efficiency of medium-sized hospitals in Japan: an improved slacks-based measure data envelopment analysis approach. Health Serv Res. 2018;53(2):896–918.

  45. Yin RK. Validity and generalization in future case study evaluations. Evaluation. 2013;19(3):321–32.

  46. Dyson RG, Allen R, Camanho AS, Podinovski VV, Sarrico CS, Shale EA. Pitfalls and protocols in DEA. Eur J Oper Res. 2001;132(2):245–59.

  47. Ozcan YA. Health care benchmarking and performance evaluation: an assessment using data envelopment analysis (DEA). Berlin: Springer; 2014.

  48. Alatawi AD, Niessen LW, Khan JA. Efficiency evaluation of public hospitals in Saudi Arabia: an application of data envelopment analysis. BMJ Open. 2020;10(1):e031924.

  49. Ghahremanloo M, Hasani A, Amiri M, Hashemi-Tabatabaei M, Keshavarz-Ghorabaee M, Ustinovičius L. A novel DEA model for hospital performance evaluation based on the measurement of efficiency, effectiveness, and productivity. Eng Manag Prod Serv. 2019;12(1):7–19.

  50. Sultan WI, Crispim J. Evaluating the productive efficiency of Jordanian public hospitals. Int J Bus Manag. 2016;12(1):68–83.

  51. Nabilou B, Yusefzadeh H, Rezapour A, Ebadi Fard Azar F, Salem Safi P, Sarabi Asiabar A, et al. The productivity and its barriers in public hospitals: case study of Iran. Med J Islamic Repub Iran. 2016;30:316.

  52. Hatefi SM, Haeri A. Evaluating hospital performance using an integrated balanced scorecard and fuzzy data envelopment analysis. J Health Manag Inform. 2019;6(2):66–76.

  53. Lyroudi K, Glaveli N, Koulakiotis A, Angelidis D. The productive performance of public hospital clinics in Greece: a case study. Health Serv Manag Res. 2006;19(2):67–72.

  54. Allin S, Veillard J, Wang L, Grignon M. How can health system efficiency be improved in Canada? Healthc Policy. 2015;11(1):33–45.

  55. Nistor CS, Ștefănescu CA, Crișan AR. Performance through efficiency in the public healthcare system—a DEA approach in an emergent country. Studia Universitatis Babes-Bolyai Oeconomica. 2017;62(1):31–49.

  56. Gai R, Zhou C, Xu L, Zhu M, Wang X, Li X. Health resource allocation and productive efficiency of Chinese county hospitals: data from 1993 to 2005. Biosci Trends. 2010;4(5):218–24.

  57. Park JS, Fowler KL, Giebel SA. Measuring hospital operating efficiencies for strategic decisions. Int J Bus Soc Sci. 2011;2(13):56–60.

  58. Dexter F, O’Neill L, Xin L, Ledolter J. Sensitivity of super-efficient data envelopment analysis results to individual decision-making units: an example of surgical workload by specialty. Health Care Manag Sci. 2008;11(4):307–18.

  59. Köse T, Uçkun N, Girginer N. An efficiency analysis of the clinical departments of a public hospital in Eskisehir by using DEA. Glob J Adv Pure Appl Sci. 2014;4:252–8.

  60. Bilsel M, Davutyan N. Hospital efficiency with risk adjusted mortality as undesirable output: the Turkish case. Ann Oper Res. 2011;221(1):73–88.

  61. Ferrier GD, Trivitt JS. Incorporating quality into the measurement of hospital efficiency: a double DEA approach. J Prod Anal. 2013;40(3):337–55.

  62. Koopmans TC. Analysis of production as an efficient combination of activities. Activity analysis of production and allocation. New York: Wiley; 1951. p. 33–7.

  63. Pastor JT, Aparicio J. Translation invariance in data envelopment analysis. In: Zhu J, editor. Data envelopment analysis. Boston: Springer; 2015. p. 245–68.

  64. Färe R, Grosskopf S. Modeling undesirable factors in efficiency evaluation: comment. Eur J Oper Res. 2004;157(1):242–5.

  65. Guo H, Zhao Y, Niu T, Tsui KL. Hong Kong Hospital Authority resource efficiency evaluation: via a novel DEA-Malmquist model and Tobit regression model. PLoS One. 2017;12(9):e0184211.

  66. Golany B, Roll Y. An application procedure for DEA. Omega. 1989;17(3):237–50.

  67. Knox Lovell CA, Pastor JT, Turner JA. Measuring macroeconomic performance in the OECD: a comparison of European and non-European countries. Eur J Oper Res. 1995;3(87):507–18.

  68. Sharp JA, Meng W, Liu WA. Modified slacks-based measure model for data envelopment analysis with “natural” negative outputs and inputs. J Oper Res Soc. 2007;58(12):1672–7.

  69. See KF, Yen SH. Does happiness matter to health system efficiency? A performance analysis. Health Econ Rev. 2018;8(1):1–10.

  70. Kirigia JM, Asbu EZ. Technical and scale efficiency of public community hospitals in Eritrea: an exploratory study. Health Econ Rev. 2013;3(1):1–16.

  71. Mujasi PN, Asbu EZ, Puig-Junoy J. How efficient are referral hospitals in Uganda? A data envelopment analysis and Tobit regression approach. BMC Health Serv Res. 2016;16(1):230.

  72. Ayiko R, Mujasi PN, Abaliwano J, Turyareeba D, Enyaku R, Anguyo R. Levels, trends and determinants of technical efficiency of general hospitals in Uganda: data envelopment analysis and Tobit regression analysis. BMC Health Serv Res. 2020;20(1):1–12.

  73. Banker RD, Charnes A, Cooper WW. Some models for estimating technical and scale inefficiencies in data envelopment analysis. Manag Sci. 1984;30(9):1078–92.

  74. Banker RD, Chang H, Cooper WW. Simulation studies of efficiency, returns to scale and misspecification with nonlinear functions in DEA. Ann Oper Res. 1996;66(4):231–53.

  75. Yildirim C, Kacak H, Yildirim S, Kavuncubasi S. Comprehensive efficiency assessment of Turkish teaching hospitals: technical, pure technical and scale efficiencies with data envelopment analysis. J Appl Bus Econ. 2019;21(3):124–40.

  76. Bogetoft P, Otto L. Benchmarking with DEA, SFA, and R, vol. 157. New York: Springer; 2011.

  77. Wang X, Luo H, Qin X, Feng J, Gao H, Feng Q. Evaluation of performance and impacts of maternal and child health hospital services using data envelopment analysis in Guangxi Zhuang autonomous region, China: a comparison study among poverty and non-poverty county level hospitals. Int J Equity Health. 2016;15(1):1–6.

  78. Sultan WI, Crispim J. Measuring the efficiency of Palestinian public hospitals during 2010–2015: an application of a two-stage DEA method. BMC Health Serv Res. 2018;18(1):1–17.

  79. Simar L, Wilson PW. Estimation and inference in two-stage, semi-parametric models of production processes. J Econom. 2007;136(1):31–64.

  80. Banker RD, Natarajan R. Evaluating contextual variables affecting productivity using data envelopment analysis. Oper Res. 2008;56(1):48–58.

  81. Mujasi PN, Kirigia JM. Productivity and efficiency changes in referral hospitals in Uganda: an application of Malmquist Total productivity index. Health Syst Policy Res. 2016;3(1):1–12.

  82. Coelli TJ, Rao DSP, O’Donnell CJ, Battese GE. An introduction to productivity and efficiency analysis. New York: Springer Science; 1998.

  83. Cooper WW, Seiford LM, Zhu J. Handbook on data envelopment analysis. Boston: Kluwer Academic Publishers; 2004.

  84. Ozcan YA. Health care benchmarking and performance evaluation. New York: Springer; 2008.

  85. Tlotlego N, Nonvignon J, Sambo LG, Asbu EZ, Kirigia JM. Assessment of productivity of hospitals in Botswana: a DEA application. Int Arch Med. 2010;3(1):1–14.

  86. Sarabi Asiabar A, Sharifi T, Rezapour A, Khatami Firouzabadi SMA, Haghighat-Fard P, Saeed M-P. Technical efficiency and its affecting factors in Tehran’s public hospitals: DEA approach and Tobit regression. Med J Islamic Repub Iran. 2020;34(1):1228–36.

  87. Ali M, Debela M, Bamud T. Technical efficiency of selected hospitals in eastern Ethiopia. Health Econ Rev. 2017;7(1):1–13.

  88. Li NN, Wang CH, Ni H, Wang H. Efficiency and productivity of county-level public hospitals based on the data envelopment analysis model and Malmquist index in Anhui, China. Chin Med J. 2017;130(23):2836.

  89. Killick T. Policy economics: a textbook of applied economics on developing countries. London: Heinemann; 1982.

Acknowledgements

The authors are grateful to Azienda Zero UOC Controllo di Gestione e Adempimenti LEA for granting us the use of the secondary data.

Funding

The authors did not receive any funding for this study.

Author information

Authors and Affiliations

Authors

Contributions

LPO and CL conceptualized the study and contributed to the data collection, review of literature, data analysis, interpretation of results and writing of the manuscript. SV and BC contributed to data analysis and review of the manuscript. All authors read and approved the submitted manuscript.

Corresponding author

Correspondence to Luca Piubello Orsini.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Cite this article

Piubello Orsini, L., Leardini, C., Vernizzi, S. et al. Inefficiency of public hospitals: a multistage data envelopment analysis in an Italian region. BMC Health Serv Res 21, 1281 (2021). https://doi.org/10.1186/s12913-021-07276-5
