
Standardized mortality ratios for regionalized acute cardiovascular care

Abstract

Background

Standardized Mortality Ratios (SMRs) are case-mix adjusted mortality rates per hospital and are used to evaluate quality of care. However, acute care is increasingly organized on a regional level, with more severely affected patients admitted to specialized hospitals. We hypothesize that the current case-mix adjustment insufficiently captures differences in case-mix between non-specialized and specialized hospitals. We aim to improve the SMR by adding proxies of disease severity to the model and by calculating a regional SMR (RSMR) for acute cerebrovascular disease (CVD) and myocardial infarction (MI).

Methods

We used data from the Dutch National Basic Registration of Hospital Care. We selected all admissions from 2016 to 2018. SMRs and RSMRs were calculated by dividing the observed in-hospital mortality by the expected in-hospital mortality. The expected in-hospital mortality was calculated using logistic regression with in-hospital mortality as outcome, adjusting for age, sex, socioeconomic status, severity of main diagnosis, urgency of admission, Charlson comorbidity index, place of residence before admission, and month/year of admission.

Results

The IQR of hospital SMRs for CVD was 0.85–1.10, median 0.94, with higher SMRs for specialized hospitals (median 1.12, IQR 1.00–1.28, 71% SMR > 1) than for non-specialized hospitals (median 0.92, IQR 0.82–1.07, 32% SMR > 1). The IQR of RSMRs was 0.92–1.09, median 1.00. The IQR of hospital SMRs for MI was 0.76–1.14, median 0.98, with higher SMRs for specialized hospitals (median 1.00, IQR 0.89–1.25, 50% SMR > 1) than for non-specialized hospitals (median 0.94, IQR 0.74–1.11, 44% SMR > 1). The IQR of RSMRs was 0.90–1.08, median 1.00. Adjustment for proxies of disease severity mostly led to lower SMRs for specialized hospitals.

Conclusion

SMRs of acute, regionally organized diseases do not only measure differences in quality of care between hospitals, but to a large extent measure differences in case-mix between hospitals. Although the addition of proxies of disease severity improves the model used to calculate SMRs, real disease severity scores would be preferred. However, such scores are not available in administrative data. As a consequence, the usefulness of the current SMR as a quality indicator is very limited. RSMRs are potentially more useful, since they fit the regional organization of care and might be a more valid representation of quality of care.


Introduction

Standardized Mortality Ratios (SMRs) are ratios of observed and expected numbers of deaths per hospital and are frequently used to evaluate quality of hospital care. SMRs can serve as an alert and a trigger for quality improvement, because a high SMR may reflect general problems with quality of care in a hospital [1]. In the Netherlands, hospitals are required to make their SMRs public. SMRs are also used in several other countries, such as the USA, Canada, Sweden, Wales, Australia, France, and Japan [2].

The validity of the SMR as an indicator of quality of care has previously been questioned [3,4,5,6,7,8,9,10]. To compare hospital performance, it is necessary to adjust for differences in patient characteristics (case-mix) between hospitals. Specialized hospitals often treat patients with more severe disease and therefore a higher baseline risk of mortality [3]. Insufficient case-mix adjustment will lead to higher SMRs for specialized hospitals, even though they may provide good quality of care [3, 10]. In addition, specialized hospitals receive patients from non-specialized hospitals for specific treatments or interventions. These transferred patients often have more advanced disease and a higher risk of mortality [11]. The specialized hospital that admits a transferred patient is negatively affected when that patient dies during the hospital stay, whereas the referring hospital is credited with a positive outcome (the patient left that hospital alive) [3, 10]. Such ‘referral bias’ can make the apparent differences in SMRs between specialized and non-specialized hospitals even larger. We hypothesize that the current case-mix adjustment insufficiently captures differences in case-mix between specialized and non-specialized hospitals. A regionally measured SMR could therefore be a more valid and useful measure of quality of care. Within a region, specialized and non-specialized hospitals usually collaborate and agree on where a patient with a specific disease will be treated; these hospitals share responsibility for the acute treatment of a patient. We aimed to compare hospital SMRs of specialized and non-specialized hospitals for two acute diseases for which care is regionally organized: acute cerebrovascular disease and acute myocardial infarction. To better capture the differences in case-mix between specialized and non-specialized hospitals, we investigated the influence of additional adjustment for proxies of disease severity. To take regionalization into account in the evaluation of hospital care, we calculated regional SMRs (RSMRs).

Methods

Study design and population

We conducted an observational retrospective cohort study using data from the Dutch National Basic Registration of Hospital Care (LBZ). This database covers all general and university hospitals in the Netherlands and contains all hospital admissions. Patients were allocated to diagnostic groups, which are clusters of ICD codes registered in the LBZ [12]. The ICD code of the main diagnosis of the admission, i.e. the main reason for the hospital stay as determined at discharge, was used. In total, 157 diagnostic groups are distinguished. We used the diagnostic groups acute cerebrovascular disease and acute myocardial infarction and selected all admissions with a discharge date from 1 January 2016 to 31 December 2018. Patients admitted to different hospitals in the same period (e.g. transferred patients) were included for every hospital to which they were admitted. Hospitals were classified as specialized or non-specialized based on the provision of specialized treatments: for cerebrovascular disease, specialized hospitals provide endovascular thrombectomy (EVT); for myocardial infarction, specialized hospitals provide cardiac surgery and endovascular interventions. Because acute cerebrovascular disease comprises a heterogeneous group of diagnoses, we further subdivided the specialized hospitals into endovascular-capable centers and comprehensive stroke centers. Two oncology hospitals were excluded from the analysis because they had very few admissions with acute cerebrovascular disease or acute myocardial infarction. Hospitals with no admissions (i.e. only an outpatient clinic) and patients not living in the Netherlands were also excluded.

Hospital SMR

We used the same approach as Statistics Netherlands uses to calculate the SMR for each diagnostic group [12]. For the selected diagnostic groups, a prediction model was used to calculate the expected probability of mortality of an admission, with data from all hospitals, adjusted for case-mix. Case-mix adjustment was based on logistic regression models with in-hospital mortality as the dependent variable and age, sex, socioeconomic status (based on the postal code of the patients’ residence), severity of main diagnosis (risk of mortality/probability of death based on historical data), urgency of admission (elective or acute), Charlson comorbidity index, place of residence before admission, and month/year of admission as predictor variables. These models produced an expected mortality probability for each admission. Adding up the probabilities of mortality per hospital gave the total expected number of deaths for that hospital. With this expected mortality number, we calculated the SMR.

$$\text{SMR}_{dh} = \frac{\text{Observed mortality}_{dh}}{\text{Expected mortality}_{dh}}$$

The numerator was the observed number of deaths with main diagnosis d in hospital h. The denominator was the expected number of deaths for this type of diagnosis. An SMR of 1 means that the observed in-hospital mortality is the same as the expected in-hospital mortality. An SMR of < 1 is better than expected and > 1 is worse than expected.

Statistics Netherlands reports a case-mix model C-statistic of 0.81 for cerebrovascular disease and 0.85 for myocardial infarction.
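To illustrate the calculation, the following is a minimal R sketch of the case-mix model and the hospital SMR. The data frame `adm` and its column names (e.g. `died`, `severity_main_dx`, `hospital`) are hypothetical placeholders; the actual LBZ variables and the Statistics Netherlands model specification are more detailed.

```r
library(dplyr)

# Case-mix model per diagnostic group: in-hospital mortality as outcome,
# fitted on admissions from all hospitals (column names are illustrative)
fit <- glm(died ~ age + sex + ses + severity_main_dx + urgency +
             charlson + residence_before + month_year,
           family = binomial, data = adm)

# Expected probability of in-hospital death per admission
adm$p_expected <- predict(fit, type = "response")

# SMR per hospital: observed deaths divided by the sum of expected probabilities
smr_hospital <- adm %>%
  group_by(hospital) %>%
  summarise(observed = sum(died),
            expected = sum(p_expected),
            SMR      = observed / expected)
```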

Regional SMR

Regions were determined by the Dutch regional network of acute care (n = 11). Each region contains at least one specialized hospital. The RSMR was calculated with the same regression model as the SMR calculated on a hospital level. This regression model estimates the probability of mortality per patient, which can be summed up per region to get the expected number of deaths. The RSMR can be calculated from this expected number and the observed mortality.
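Continuing the sketch above, the RSMR requires only a different level of aggregation of the same expected probabilities; the `region` column, linking each hospital to one of the 11 acute-care networks, is again a hypothetical placeholder.

```r
library(dplyr)

# RSMR: observed and expected deaths summed over all hospitals in a region
rsmr_region <- adm %>%
  group_by(region) %>%
  summarise(observed = sum(died),
            expected = sum(p_expected),
            RSMR     = observed / expected)
```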

Severity adjustment

We investigated differences in SMRs between specialized and non-specialized hospitals. We hypothesized that these differences are due to differences in case-mix between hospitals, especially in the severity of the main diagnosis, as has been reported previously [3, 10]. True severity scores were not available in administrative data. To adjust for severity of the main diagnosis, we therefore used specific treatments or interventions performed during admission, as registered in the administrative data, as proxies of disease severity. For every diagnostic group we added these proxies, separately, to the current case-mix adjustment model to investigate their influence on the SMR. For all patients we investigated the influence of inter-hospital transfer and intensive care unit (ICU) admission on the SMR. Inter-hospital transfer was defined as admission or presentation in a general hospital followed by presentation or admission to a specialized hospital within one day. Inter-hospital transfer was only counted for the receiving specialized hospital, since the SMR of the receiving hospital is negatively affected when the patient dies during the hospital stay. ICU admission was defined as admission to the ICU within the first two days of the hospital admission. For patients with acute cerebrovascular disease we additionally investigated the influence of adding EVT to the case-mix adjustment model. For patients with acute myocardial infarction we investigated the influence of clinical percutaneous coronary intervention (PCI) and coronary artery bypass grafting (CABG).
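As a sketch of this severity adjustment, each proxy can be added separately to the case-mix model and the resulting SMRs compared with those from the current model. The variable name `icu_adm` is a hypothetical placeholder, and the sketch continues the one above; EVT, PCI, CABG, and inter-hospital transfer would be added in the same way, one at a time.

```r
library(dplyr)

# Current case-mix model extended with one proxy of disease severity
# (here: ICU admission within the first two days of the hospital stay)
fit_proxy <- glm(died ~ age + sex + ses + severity_main_dx + urgency +
                   charlson + residence_before + month_year + icu_adm,
                 family = binomial, data = adm)

adm$p_expected_proxy <- predict(fit_proxy, type = "response")

# SMR under the current model versus the model with the added proxy
smr_compare <- adm %>%
  group_by(hospital) %>%
  summarise(SMR_current = sum(died) / sum(p_expected),
            SMR_proxy   = sum(died) / sum(p_expected_proxy))
```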

Statistical analysis and missing data

We used descriptive statistics to compare the case-mix variables, proxies of disease severity, and outcomes between specialized and non-specialized hospitals. We reported the median and interquartile range (IQR) of the SMRs per disease, separately for specialized and non-specialized hospitals. In addition, we reported the percentage of SMRs above 1 (higher mortality than expected) for specialized and non-specialized hospitals.
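These summaries could be derived from the hospital-level SMRs as in the sketch below, where `hospital_type` (a table linking each hospital to specialized or non-specialized status) is a hypothetical lookup table.

```r
library(dplyr)

smr_hospital %>%
  left_join(hospital_type, by = "hospital") %>%   # adds column: type
  group_by(type) %>%
  summarise(median_SMR   = median(SMR),
            IQR_lower    = quantile(SMR, 0.25),
            IQR_upper    = quantile(SMR, 0.75),
            pct_SMR_gt_1 = 100 * mean(SMR > 1))
```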

We investigated the influence of additional adjustment for proxies of disease severity on the SMR. Missing values in these proxies were imputed using single imputation in R, based on relevant covariates and outcomes. We performed logistic regression analyses to investigate the association of the proxies of disease severity with in-hospital mortality and present (adjusted) odds ratios (ORs) with 95% confidence intervals (CIs). We compared the current case-mix adjustment model with a model additionally adjusted for the proxies of disease severity. Performance of the different models was expressed as the area under the receiver operating characteristic (ROC) curve (AUC). To take into account uncertainty due to low numbers of deaths per hospital or region, we performed a sensitivity analysis in which we used a mixed effects logistic regression model (GLMM) with a random intercept for hospital to estimate the SMR and a random intercept for region to estimate the RSMR. We compared these two approaches to examine whether and how the SMR changes with the choice of model. All statistical analyses were performed with R statistical software (version 3.6.1).
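The sensitivity analysis and the model comparison could be set up as in the sketch below, using the lme4 and pROC packages. Deriving a shrunken ratio as predicted over expected mortality is one common approach for random-intercept models (as in CMS-style risk standardization) and is not necessarily the authors' exact implementation; column names are hypothetical and continue the sketches above.

```r
library(lme4)
library(pROC)
library(dplyr)

# GLMM with a random intercept per hospital (use (1 | region) for the RSMR)
fit_mixed <- glmer(died ~ age + sex + ses + severity_main_dx + urgency +
                     charlson + residence_before + month_year + (1 | hospital),
                   family = binomial, data = adm)

# Expected mortality under the average hospital effect (re.form = NA) versus
# predicted mortality including the (shrunken) hospital-specific intercept
adm$p_expected_avg <- predict(fit_mixed, type = "response", re.form = NA)
adm$p_predicted    <- predict(fit_mixed, type = "response")

smr_mixed <- adm %>%
  group_by(hospital) %>%
  summarise(SMR = sum(p_predicted) / sum(p_expected_avg))  # shrunk towards 1

# Discrimination of the current model versus the model with added proxies
auc_current  <- auc(roc(adm$died, adm$p_expected))
auc_extended <- auc(roc(adm$died, adm$p_expected_proxy))
```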

Results

Acute cerebrovascular disease

We included 110,155 admissions with acute cerebrovascular disease (Table 1). The most common diagnoses were acute cerebral infarction (86,343; 78%), intracerebral haemorrhage (13,401; 12%), and subarachnoid haemorrhage (4,698; 4%) (Supplement Table SI). There were no clear differences in the case-mix variables between specialized and non-specialized hospitals (Table 1). Patients in specialized hospitals had more severe disease than patients in non-specialized hospitals, as indicated by the proxies of disease severity. In specialized hospitals, 8% of the patients had been transferred from a general hospital. There were more ICU admissions in specialized hospitals (13%) than in non-specialized hospitals (4%). EVT was performed only in specialized hospitals (11%). Overall, in-hospital mortality was 9%.

Table 1 Characteristics of all admissions with acute cerebrovascular disease and acute myocardial infarction, 2016–2018

The IQR of the SMR for acute cerebrovascular disease per hospital was 0.85 to 1.10, median 0.94 (Fig. 1A), with higher SMRs for specialized hospitals (median 1.12, IQR 1.00–1.28, 71% SMR > 1) than for non-specialized hospitals (median 0.92, IQR 0.82–1.07, 32% SMR > 1). The IQR of the SMR per region was 0.92 to 1.09, median 1.00 (Fig. 1B).

Patients undergoing ICU admission, EVT, or inter-hospital transfer were more likely to die (aOR: 4.14 [95% CI: 3.86–4.43], aOR: 2.31 [95% CI: 2.06–2.55], and aOR: 1.12 [95% CI: 0.98–1.27], respectively) (Table 2). The addition of ICU admission or EVT to the model decreased the SMRs of specialized hospitals and increased the SMRs of non-specialized hospitals (Table 3). Additional adjustment for all proxies of disease severity improved the model compared with the current case-mix adjustment model (C-statistic 0.83 [95% CI 0.827–0.835] versus 0.81 [95% CI 0.806–0.815]). As expected, the IQRs of the hospital SMRs and RSMRs were smaller in the sensitivity analysis with the mixed effects models than in the main analysis (Fig. 1C and D).

Table 2 Association between proxies of disease severity and in-hospital mortality in acute cerebrovascular disease and acute myocardial infarction
Fig. 1 Standardized mortality ratio (SMR) of acute cerebrovascular disease. (A) Calculated on a hospital level with a fixed effects model; (B) calculated per region with a fixed effects model; (C) calculated on a hospital level with a mixed effects model; (D) calculated per region with a mixed effects model

Acute myocardial infarction

We included 101,229 admissions with acute myocardial infarction (Table 1); the most frequently used ICD code was acute subendocardial myocardial infarction (56%) (Table SII of the Data Supplement). There were differences in disease severity between specialized and non-specialized hospitals: in specialized hospitals, 13% of the patients had been transferred from a general hospital. There were more ICU admissions in specialized hospitals (16%) than in non-specialized hospitals (2%). PCI was performed in 69% of the admissions in specialized hospitals compared with 29% of the admissions in non-specialized hospitals. CABG was performed only in specialized hospitals, in 11% of the admissions. Overall, in-hospital mortality was 3%.

The IQR of the SMR for acute myocardial infarction per hospital was 0.76 to 1.14, median 0.98 (Fig. 2A), with higher SMRs for specialized hospitals (median 1.00, IQR 0.89–1.25, 50% SMR > 1) than for non-specialized hospitals (median 0.94, IQR 0.74–1.11, 44% SMR > 1). The IQR of the SMRs per region was 0.90 to 1.08, median 1.00 (Fig. 2B).

Patients undergoing ICU admission, CABG, or inter-hospital transfer were more likely to die (aOR: 7.35 [95% CI: 6.65–8.13], aOR: 1.87 [95% CI: 1.51–2.32], and aOR: 2.12 [95% CI: 1.73–2.61], respectively) (Table 2). The addition of ICU admission, CABG, or inter-hospital transfer to the model decreased the SMRs of specialized hospitals, whereas PCI did not (Table 3). Additional adjustment for all proxies of disease severity improved the model compared with the current case-mix adjustment model (C-statistic 0.88 [95% CI 0.879–0.891] versus 0.85 [95% CI 0.843–0.856]). Again, the IQRs of the hospital SMRs and RSMRs were smaller in the sensitivity analysis with the mixed effects models than in the main analysis (Fig. 2C and D).

Table 3 The influence of adding proxies of disease severity to the case-mix adjustment model on the SMR
Fig. 2 Standardized mortality ratio (SMR) of acute myocardial infarction. (A) Calculated on a hospital level with a fixed effects model; (B) calculated per region with a fixed effects model; (C) calculated on a hospital level with a mixed effects model; (D) calculated per region with a mixed effects model

Discussion

In this study we showed that between-hospital differences in SMRs at least partly represent differences in case-mix between specialized and non-specialized hospitals rather than differences in quality of care.

We showed large differences in SMRs between specialized and non-specialized hospitals for cerebrovascular disease and myocardial infarction. Most of the specialized hospitals had SMRs above 1. This could mean that specialized hospitals provide poorer quality of care; however, we argue that it is more likely that these differences reflect differences in case-mix between specialized and non-specialized hospitals. We tried to improve the case-mix adjustment model with additional adjustment for proxies of disease severity, because a true severity score is not available in administrative data. We would, however, advocate improving case-mix adjustment with disease-specific severity scores rather than with procedures used as proxies. Despite the strong association between these proxies and in-hospital mortality, adding them to the model changed the SMRs only to a limited extent. This may mean that there are other differences between specialized and non-specialized hospitals not captured by our additional adjustment, or that the number of patients to whom the proxies apply is low. Previous studies on the influence of transfer on the SMR showed that adjustment for transfer barely changed the SMR [10]. The influence of cardiac procedures (PCI, CABG) on the SMR was also relatively small [14, 15]. In our study, the impact of additional adjustment for these cardiac procedures on the SMR was likewise relatively small. The proxies of disease severity could be influenced by practice variation and policy variation between hospitals; however, patients with cerebrovascular disease and myocardial infarction are treated according to strict guidelines.

Ideally, deaths are attributed to the hospital that is responsible for the undesirable outcome. In US CMS (Centers for Medicare and Medicaid Services) reports, the death of a transferred patient is attributed only to the referring hospital. This reduces referral bias, but such a death could also be the result of the quality of care in the receiving center. The cascade leading to death might have started in the referring hospital, during transport, or only in the receiving specialized hospital. In practice the actual scenario is unknown, and thus no perfect solution exists. We therefore attributed deaths in transferred patients to the receiving hospital and added ‘transfer’ to the model that produces the number of expected deaths.

The reason for transfer is also unknown. In general, more severely affected patients will be transferred, but patients at very high risk may not be transferred if they are unlikely to benefit from an intervention. In previous studies, the SMR barely changed after adjustment for inter-hospital transfer [10]. Apparently, transferred patients had a similar mortality risk profile to patients who were directly admitted to the specialized hospital. One might speculate that the decision to transfer patients is dominated by the expected benefits of further (invasive) treatment rather than by the estimated mortality risk. The modest influence of cardiac procedures (PCI, CABG) on the SMR, and the low odds of death in patients undergoing PCI, can also be explained in this way. Patients undergoing PCI are less severely diseased than patients undergoing CABG; PCI is therefore probably not a good proxy of disease severity. A clinical severity score of the main diagnosis would be preferable for inclusion in the adjustment models. For acute cerebrovascular disease, the National Institutes of Health Stroke Scale (NIHSS) score is frequently used in clinical registries to indicate the severity of neurologic deficit [16]. For acute myocardial infarction, the HEART score could be used. However, these scores are not available in the administrative data used to calculate SMRs. Further research is needed to investigate the performance of the SMR model when disease-specific severity scores are added to the adjustment and whether this reduces the differences in SMRs between specialized and non-specialized hospitals.

A novel approach in our study was to calculate the SMR on a regional level. We showed that the range of the regional SMRs is much smaller than the range of the hospital SMRs. When the SMR is measured on a regional level instead of a hospital level, differences in case-mix between specialized and non-specialized hospitals matter less, since high-risk patients are more equally distributed over the regions than over individual hospitals. In addition, the smaller range can be explained by lower statistical uncertainty, due to the higher number of patients in a region compared with a single hospital. The SMR in the Netherlands is currently estimated with fixed effects models. In such models the estimates are based solely on the observed outcome in each hospital, which can lead to extreme estimates by chance [17]. A mixed effects model shrinks the hospital estimates towards the mean, especially in case of small sample sizes, and results in more conservative estimates of ‘performance’, which we consider preferable for publicly reported performance estimates such as the SMRs. When we calculated the SMR with a mixed effects model and compared the SMR measured on a hospital level with the RSMR, the differences were still present but smaller than in the main analysis.

Given the limitations of the SMR measured on a hospital level, the RSMR could be a more useful quality indicator. Regional SMRs are more in line with the clinical practice of regionalized care and underline the shared responsibility of multiple hospitals for the treatment of a patient. We described two acute diseases that are regionally organized and for which the regions are clearly defined, with little interregional transfer. This makes it possible to compare regional care. Each region is allowed to decide how its care is arranged, which implies that there are differences between regions in the care of patients. We showed that there is only small variation in SMRs between regions in the Netherlands; however, the funnel plot showed that some RSMRs fall outside the confidence intervals, so there is potential for improvement. Further research is needed to investigate the underlying causes of the variation between regions.

Strengths and limitations

A strength of this study is the large number of patients we analysed. We used the same data on which the current SMR is calculated and improved the model using available data, to ensure that these adaptations could be implemented directly.

A limitation of the study is the content of the diagnostic groups. Patients were divided into diagnostic groups, which are clusters of ICD codes. Acute cerebrovascular disease comprises acute ischemic stroke, intracerebral haemorrhage, subarachnoid haemorrhage, acute nontraumatic subdural haemorrhage, and occlusion and stenosis of cerebral arteries not resulting in cerebral infarction. The probability of mortality differs between these diagnoses. For example, occlusion and stenosis of cerebral arteries not resulting in cerebral infarction (n = 364, 0.3%) does not lead directly to mortality and does not require acute care, in contrast to subarachnoid haemorrhage, which has a high probability of mortality. It could be questioned whether all diagnoses are in the right diagnostic group. We did not change the content of the diagnostic groups, because otherwise it would be difficult to compare the regional SMR model with the current SMR adjustment model. Moreover, the number of patients with a non-acute diagnosis was very small. Length of stay could influence in-hospital mortality and discharge bias could also play a role; however, we did not explicitly study this.

Conclusion

SMRs for acute, regionally organized diseases vary substantially and at least partly represent differences in case-mix between specialized and non-specialized hospitals rather than differences in quality of care. Although the addition of proxies of disease severity improves the model used to calculate SMRs, real disease severity scores would be preferred; however, such scores are not yet available in administrative data. As a consequence, the usefulness of the current SMR measured on a hospital level as a quality indicator is very limited. RSMRs could be preferable over hospital SMRs, since they fit the regional organization of care and might be a more valid representation of quality of care.

Data Availability

The data that support the findings of this study are available from DHD but restrictions apply to the availability of these data, which were used under license for the current study, and so are not publicly available. Data are however available from the authors upon reasonable request and with permission of DHD.

Abbreviations

CABG: coronary artery bypass grafting

CI: confidence interval

CVD: cerebrovascular disease

DHD: Dutch Hospital Data

EVT: endovascular thrombectomy

ICU: intensive care unit

IQR: interquartile range

LBZ: Dutch National Basic Registration of Hospital Care

MI: myocardial infarction

OR: odds ratio

PCI: percutaneous coronary intervention

RSMR: regional standardized mortality ratio

SMR: standardized mortality ratio

References

  1. Cecil E, Bottle A, Esmail A, Vincent C, Aylin P. What is the relationship between mortality alerts and other indicators of quality of care? A national cross-sectional study. J Health Serv Res Policy. 2020;25(1):13–21.


  2. Jarman B, Pieter D, van der Veen AA, Kool RB, Aylin P, Bottle A, Westert GP, Jones S. The hospital standardised mortality ratio: a powerful tool for dutch hospitals to assess their quality of care? Qual Saf Health Care. 2010;19(1):9–13.


  3. van Gestel YR, Lemmens VE, Lingsma HF, de Hingh IH, Rutten HJ, Coebergh JW. The hospital standardized mortality ratio fallacy: a narrative review. Med Care. 2012;50(8):662–7.


  4. Lilford R, Pronovost P. Using hospital mortality rates to judge hospital performance: a bad idea that just won’t go away. BMJ. 2010;340:c2016.


  5. Mohammed MA, Deeks JJ, Girling A, Rudge G, Carmalt M, Stevens AJ, Lilford RJ. Evidence of methodological bias in hospital standardised mortality ratios: retrospective database study of English hospitals. BMJ. 2009;338:b780.


  6. Thomas JW, Hofer TP. Accuracy of risk-adjusted mortality rate as a measure of hospital quality of care. Med Care. 1999;37(1):83–92.


  7. Pitches DW, Mohammed MA, Lilford RJ. What is the empirical evidence that hospitals with higher-risk adjusted mortality rates provide poorer quality care? A systematic review of the literature. BMC Health Serv Res. 2007;7:91.


  8. Penfold RB, Dean S, Flemons W, Moffatt M. Do hospital standardized mortality ratios measure patient safety? HSMRs in the Winnipeg Regional Health Authority. Healthc Pap. 2008;8(4):8–24. discussion 69–75.


  9. Shojania KG, Forster AJ. Hospital mortality: when failure is not a good measure of success. CMAJ. 2008;179(2):153–7.


  10. Pouw ME, Peelen LM, Rinkel GJE, Moons KGM, Kalkman CJ. Standardised mortality ratios of specialised hospitals: the influence of referral bias. Thesis: Measuring standardised mortality ratios of hospitals: Challenges and recommendations 2016.

  11. Seferian EG, Afessa B, Gajic O, Keegan MT, Hubmayr RD, Mayo E. Translational research in intensive C: comparison of community and referral intensive care unit patients in a tertiary medical center: evidence for referral bias in the critically ill. Crit Care Med. 2008;36(10):2779–86.


  12. van der Laan JWC, Peters M, de Bruin A. HSMR 2018: methodological report. https://www.cbs.nl/nl-nl/onze-diensten/methoden/onderzoeksomschrijvingen/aanvullende-onderzoeksbeschrijvingen/hsmr-2018-methodological-report. Accessed 2 February 2021.

  13. van den Bosch WF, Spreeuwenberg P, Wagner C. [Hospital standardised mortality ratio (HSMR): adjustment for severity of primary diagnosis can be improved] Gestandaardiseerd ziekenhuissterftecijfer (HSMR): correctie voor ernst hoofddiagnose kan beter. Ned Tijdschr Geneeskd. 2011;155(27):A3299.


  14. den Ouden AL, van der Wal G. The usefulness of the hospital standardized mortality ratio as an indicator of hospital mortality. Ned Tijdschr Geneeskd. 2008;152(21):1191–2.


  15. van den Bosch WF, Graafmans WC, Pieter D, Westert GP. The effect of specialised medical procedures on the hospital standardised mortality ratio in cardiac centers. Ned Tijdschr Geneeskd. 2008;152(21):1221–7.


  16. Rost NS, Bottle A, Lee JM, Randall M, Middleton S, Shaw L, Thijs V, Rinkel GJ, Hemmen TM. Global comparators stroke gc: stroke severity is a crucial predictor of outcome: an international prospective validation study. J Am Heart Assoc 2016, 5(1).

  17. Lingsma HF, Steyerberg EW, Eijkemans MJ, Dippel DW, Scholte Op Reimer WJ, Van Houwelingen HC. Netherlands Stroke Survey I: comparing and ranking hospitals based on outcome: results from the Netherlands Stroke Survey. QJM. 2010;103(2):99–108.


  18. https://www.dhd.nl/producten-diensten/lbz/paginas/dataverzameling-lbz.aspx. Accessed 2 February 2021.


Acknowledgements

None.

Funding

None.

Author information


Contributions

SH wrote the statistical analysis plan, cleaned and analysed the data, and drafted and revised the paper. BR, HL, SvB wrote the statistical analysis plan and drafted and revised the paper. DD, EB, CD, NvL, MA revised the draft paper. SH, SvB, HL are responsible for the overall content as guarantors. The corresponding author attests that all listed authors meet authorship criteria and that no others meeting the criteria have been omitted.

Corresponding author

Correspondence to Sanne J. den Hartog.

Ethics declarations

Ethics approval and consent to participate

Dutch Hospital Data (DHD), the national organization that administers data from all hospitals, gave permission to use the data for this analysis. DHD extracts and processes data according to strict national guidelines for the privacy protection of patients [18]. Only the authors (SH and SB) had access to the data. This access was granted within the scope of DHD’s privacy policy. No third parties had access to the data for this project. Therefore, no waiver of ethics approval was needed from an Institutional Review Board or ethics committee following Dutch Guidelines. All methods were carried out in accordance with relevant guidelines and regulations.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

den Hartog, S.J., Roozenbeek, B., van der Bij, S. et al. Standardized mortality ratios for regionalized acute cardiovascular care. BMC Health Serv Res 23, 951 (2023). https://doi.org/10.1186/s12913-023-09883-w
