Table 3 Approaches taken to identify high performance

From: High performing hospitals: a qualitative systematic review of associated factors and practical strategies for improvement

Each entry below lists the study, its data sources, the measure type, and the methodological/statistical approach used to identify high performing sites.

Study: Bradley et al. (2006) [10]
Data sources: National Registry of Myocardial Infarction
Measure type: Process
Approach: Hospitals that treated patients with ST-segment elevation myocardial infarction undergoing percutaneous coronary intervention (PCI) were selected (n = 151). Within this group, hospitals with median door-to-balloon times of ≤90 min for their 50 most recent PCI cases were selected (n = 35). All 35 hospitals were approached for interview; interviews continued until thematic saturation was reached (after 11 hospitals).
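As a concrete illustration of this kind of screen, the sketch below filters hospitals on the median door-to-balloon time of their recent PCI cases. The hospital names and times are invented for illustration; the study worked from registry data, not from code like this.

```python
from statistics import median

# Invented door-to-balloon times (minutes) for each hospital's most
# recent PCI cases (shortened here; the study used the 50 most recent).
door_to_balloon = {
    "hospital_a": [62, 85, 74, 90, 58],
    "hospital_b": [95, 110, 88, 102, 97],
}

THRESHOLD_MIN = 90  # the <=90 min median cut-off described above

# Keep hospitals whose median door-to-balloon time meets the threshold.
high_performers = [
    name for name, times in door_to_balloon.items()
    if median(times) <= THRESHOLD_MIN
]
print(high_performers)  # -> ['hospital_a']
```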

Study: Mannion et al. (2005) [44]
Data sources: NHS star ratings
Measure type: Other (rating)
Approach: Hospitals were identified using the NHS star rating. Four low performing (zero or one star) and two high performing (three star) hospitals were included.

Study: Sautter et al. (2007) [37]
Data sources: Blue Cross Blue Shield of Michigan Participating Hospitals Program
Measure type: Process
Approach: All participating hospitals (n = 54) were categorised by the extent of change in left ventricular ejection fraction (LVEF) assessment achieved over a three-year period, giving four groups: Always optimal, Improved to optimal, Improved, and Not improved. Ten hospitals in total were sampled, covering all four categories and varying in size, teaching status and geography.

Study: Cherlin et al. (2013) [112]
Data sources: Centers for Medicare & Medicaid Services (CMS) Hospital Compare website
Measure type: Outcome
Approach: US hospitals were selected as high or low performers if their 30-day risk-standardized mortality rates were in the top or bottom 5 %, respectively, for two consecutive years. Within the top 5 %, hospitals were ranked from best to worst; within the bottom 5 %, from worst to best. Hospitals were asked to participate one at a time until theoretical saturation (reached after 14 hospitals). Of the top performers, only those that remained in the top 5 % for a third consecutive year were retained (n = 7).
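A minimal sketch of the consecutive-year tail selection described above, assuming lower risk-standardized mortality is better; the hospital names and rates are invented for illustration.

```python
def tails(rates: dict[str, float], frac: float = 0.05) -> tuple[set[str], set[str]]:
    """Return the top (lowest-rate) and bottom (highest-rate) tails."""
    ranked = sorted(rates, key=rates.get)
    k = max(1, round(frac * len(ranked)))  # at least one hospital per tail
    return set(ranked[:k]), set(ranked[-k:])

# Invented 30-day risk-standardized mortality rates for two consecutive years.
year_1 = {"h1": 0.11, "h2": 0.13, "h3": 0.18, "h4": 0.21}
year_2 = {"h1": 0.10, "h2": 0.14, "h3": 0.17, "h4": 0.22}

top_1, bottom_1 = tails(year_1)
top_2, bottom_2 = tails(year_2)

# Eligible performers sit in the same tail in both years.
high_performers = top_1 & top_2       # {'h1'}
low_performers = bottom_1 & bottom_2  # {'h4'}
```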

Study: Landman et al. (2013) [32]
Data sources: CMS Hospital Compare website
Measure type: Outcome
Approach: 30-day risk-standardized mortality rates were calculated by dividing each hospital's predicted number of deaths by its expected number of deaths within 30 days of admission. Hospitals were eligible for inclusion as high or low performers if their risk-standardized mortality rate was in the top 5 % or bottom 5 % of performance for two consecutive years.
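In symbols, with P_h the predicted and E_h the expected number of 30-day deaths for hospital h (notation introduced here, not taken from the paper), the entry above describes the ratio

```latex
\mathrm{ratio}_h = \frac{P_h}{E_h}
```

so a value below 1 means fewer predicted deaths than would be expected given the hospital's case mix. The published risk-standardized rates rescale this ratio against a national reference, a detail not reproduced in this table.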

Study: Rangachari (2008) [31]
Data sources: New York State hospital administrative database
Measure type: Process
Approach: Hospitals were categorised as good or poor performers using the percentage of uncertain coding (0–5 % = good, 95–100 % = poor). A purposeful sample of two good and two poor performers was selected from those willing to participate in the study.

Study: Baumann et al. (2007) [47]
Data sources: NS
Measure type: Outcome, Other (reporting, rating)
Approach: A multistage process was used. First, a statistical model drawing on a range of data from 1998 to 2000 was used to predict rates of discharge delay from acute hospitals and to shortlist authorities. The authors then examined delayed discharge and emergency readmission data for these sites over a longer period (1998–2002) to ensure sustained high performance. These results were cross-referenced with joint review reports by health and social care inspectorates. Star ratings and delayed discharge performance data for hospitals and primary care trusts within the shortlisted authorities were examined to ensure good performance. Hospitals were selected as high performers so as to ensure a mix of geography and local authority types.

Study: Rose et al. (2012) [45]
Data sources: Veterans Administration (details cited in Rose 2010 [110] and Rose 2011 [111])
Measure type: Output
Approach: Rankings of risk-adjusted anticoagulation control at Veterans Administration sites were used to identify the best 10 and worst 10 sites. Three of the top 10 and three of the bottom 10 were selected, representing six different states.

Study: Keroack et al. (2007) [34]
Data sources: University HealthSystem Consortium, an alliance of 97 university teaching hospitals
Measure type: Process, Output, Outcome, Other (equity score)
Approach: A composite index of patient-level process and outcome data, including indicators on preventable complications and mortality rates, evidence-based practices and equity of care, was calculated from discharge abstract data from 79 academic medical centers. Six institutions (three top and three average performers) were selected for site visits, covering different geographical areas and types of hospital ownership.
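The sketch below shows one generic way a composite index across heterogeneous indicators can be formed: standardize each indicator so that higher is better, then average. The domains, values and equal weighting are assumptions for illustration, not the consortium's actual method.

```python
from statistics import mean, stdev

def z(values: dict[str, float], higher_is_better: bool) -> dict[str, float]:
    """Standardize an indicator so that higher scores mean better care."""
    mu, sd = mean(values.values()), stdev(values.values())
    sign = 1 if higher_is_better else -1
    return {h: sign * (v - mu) / sd for h, v in values.items()}

# Invented indicators for three academic medical centers.
mortality = {"amc1": 0.04, "amc2": 0.06, "amc3": 0.05}       # lower is better
ebp_adherence = {"amc1": 0.92, "amc2": 0.85, "amc3": 0.88}   # higher is better

# Equal-weighted composite across standardized domains.
domains = [z(mortality, higher_is_better=False),
           z(ebp_adherence, higher_is_better=True)]
composite = {h: mean(d[h] for d in domains) for h in mortality}
ranking = sorted(composite, key=composite.get, reverse=True)
print(ranking)  # best to worst: ['amc1', 'amc3', 'amc2']
```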

Study: Hockey and Bates (2010) [42]
Data sources: Hospital Quality Alliance program
Measure type: Outcome
Approach: Hospital Quality Alliance (HQA) data were used to identify hospitals whose performance during the preceding two years was consistently in the top decile (high performing) or bottom decile (low performing) on internal medicine outcome measures (pneumonia and congestive heart failure). Twelve hospitals were consistent performers (either high or low) and all were asked to participate in the study. Five hospitals agreed to participate (three high performers and two low performers).

Study: Adelman (2012) [41]
Data sources: Malcolm Baldrige National Quality Award (MBNQA) or state-level Baldrige award
Measure type: Other (award recipient)
Approach: Hospitals that had won either an MBNQA or a state-level Baldrige award in the previous seven years formed the target sample. Two MBNQA and two state-level Baldrige recipients participated.

Study: VanDeusen Lukas et al. (2010) [36]
Data sources: Veterans Administration (VA) Network
Measure type: Process
Approach: Of three interested VA networks, a single network comprising seven sites was chosen. An organisational model positing that implementation of evidence-based practices (EBPs) is enhanced by the presence of three critical organisational components (active top leadership commitment; links to senior management structures and processes; multidisciplinary evidence-based clinical process redesign) was implemented in all seven sites. Hand-hygiene compliance scores and the overall fidelity of the model were calculated for each site. Sites with a fidelity score over 3 were considered high fidelity (n = 4).

Study: Kramer et al. (2008) [40]
Data sources: The National Magnet Hospital Profile
Measure type: Essentials of Magnetism (EOM) score
Approach: The EOM scale (available for 76 EOM-tested hospitals) was used to identify the highest and second highest scoring hospitals in each of the country's eight regions. A balanced sample of highest and second highest scoring hospitals was taken, reflecting hospital type (community/academic). To select local units within each hospital, the staff nurse experience, certification and education of a unit were correlated with those of the sample; the unit was included if differences across the three sets of correlations were not significant. In addition to adequate Registered Nurse representation, staff nurses' reported Control over Nursing Practice (CNP) scores had to be above the hospital mean for the unit to be included.

Study: Parsons and Cornett (2011) [35]
Data sources: American Nurses Association Magnet Recognition Program
Measure type: Other (rating)
Approach: A convenience sample of Chief Nursing Officers, drawn from the whole pool of Magnet-recognised hospitals that were willing to participate (n = 15), was selected for interview.

Study: Curry et al. (2011) [38]
Data sources: Centers for Medicare & Medicaid Services (CMS) Hospital Compare
Measure type: Outcome
Approach: Hospitals that ranked in the top 5 % of performance on risk-standardized mortality rates for acute myocardial infarction care in two consecutive years were eligible for inclusion. Selection continued until theoretical saturation, which occurred after seven high performing hospitals.

Study: Puoane et al. (2008) [33]
Data sources: Unclear
Measure type: Outcome
Approach: Eleven hospitals were given the same intervention to improve the care of malnourished children. Two high performing and two poorly performing hospitals were purposively selected based on their improvement, or lack of improvement, in case-fatality rates.

Study: Stanger et al. (2012) [43]
Data sources: Blood Stocks Management Scheme
Measure type: Output
Approach: Historical data on the inventory management of blood stock from 277 hospitals were used to create an indicator of blood wastage: wastage as a percentage of issues (WAPI). Seven units with low WAPI values were selected for further analysis.
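Spelled out (in notation introduced here, which may differ in detail from the scheme's official definition), the indicator is

```latex
\mathrm{WAPI} = \frac{\text{units of blood wasted}}{\text{units of blood issued}} \times 100\,\%
```

so lower values indicate less of the issued stock going to waste.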

Study: Kane et al. (2009) [109]
Data sources: Data provided by the hospitals to the state (regulatory requirements)
Measure type: Outcome
Approach: Researchers calculated the most recent available five consecutive years of operating margin performance, including annual performance and the average annual trend. Only non-teaching, non-profit, acute care hospitals were included in the sampling frame. Hospitals were stratified by local market area and ranked on operating margin. Thirty-six top performing hospitals were selected based on these rankings, and six agreed to participate.
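As one plausible reading of "annual performance and the average annual trend", the sketch below computes a five-year mean operating margin and an ordinary least squares slope over the years. The figures and the choice of OLS are assumptions; the study's exact trend metric is not given in this table.

```python
# Invented operating margins for five consecutive years.
margins = [0.021, 0.025, 0.028, 0.030, 0.034]

n = len(margins)
mean_year = (n - 1) / 2
mean_margin = sum(margins) / n

# Ordinary least squares slope of margin on year: the average annual trend.
slope = (
    sum((y - mean_year) * (m - mean_margin) for y, m in enumerate(margins))
    / sum((y - mean_year) ** 2 for y in range(n))
)
print(f"mean margin {mean_margin:.4f}, trend {slope:+.4f} per year")
```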

Study: Olson et al. (2011) [48]
Data sources: American Heart Association (AHA)/American Stroke Association
Measure type: Process
Approach: Top performing sites were defined as those in the top 1 % of all hospitals contributing to the "Get with the Guidelines – Stroke" programme (n = 1315) for achieving a door-to-needle time of less than 60 min. Hospitals administering tPA to fewer than 12 patients (an average of less than one patient per month) were excluded (n = 960). All hospitals that were asked to participate agreed; 13 personnel in total were interviewed at 7 top performing hospitals.

NS = information not stated in the paper