
Combining patient, clinical and system perspectives in assessing performance in healthcare: an integrated measurement framework

Abstract

Background

The science of measuring and reporting on the performance of healthcare systems is rapidly evolving. In the past decade, across many jurisdictions, organisations tasked with monitoring progress towards reform targets have broadened their purview to take a more system-functioning approach. Their aim is to bring clarity to performance assessment, using relevant and robust concepts – and avoiding reductionist measures – to build a whole-of-system view of performance. Existing performance frameworks are not fully aligned with these developments.

Methods

An eight-stage process was used to develop a conceptual framework, incorporating literature review, mapping, categorisation, integration, synthesis and validation of performance constructs that organisations and researchers have used to assess, reflect and report on healthcare performance.

Results

A total of 19 performance frameworks were identified and included in the review. Existing frameworks mostly adopted either a logic model (inputs, outputs and outcomes), a functional, or a goal-achievement approach. The mapping process identified 110 performance terms and concepts. These were integrated, synthesised and resynthesised to produce a framework that features 12 derived constructs reflecting combinations of patients’ needs and expectations; healthcare resources and structures; receipt and experience of healthcare services; healthcare processes, functions and context; and healthcare outcomes. The 12 constructs gauge performance in terms of coverage, accessibility, appropriateness, effectiveness, safety, productivity, efficiency, impact, sustainability, resilience, adaptability and equity. They reflect four performance perspectives (patient, population, delivery organisation and system).

Conclusions

Internationally, healthcare systems and researchers have used a variety of terms to categorise indicators of healthcare performance; however, few frameworks rest on a theoretical conceptual underpinning. The proposed framework incorporates a manageable number of performance domains that together provide a comprehensive assessment, as well as conceptual and operational clarity and coherence that support multifaceted measurement systems for healthcare.


Background

Across healthcare systems in high income countries, there is an established consensus that independent and impartial assessment of performance is an essential part of quality improvement efforts. Organisations and agencies that specialise in healthcare performance measurement and reporting act to oversee system functioning, promote accountability, highlight variation, identify areas for improvement, and make information available to leverage and support change [1, 2].

There is a growing recognition of the important role played by public reporting in healthcare [3, 4]. It confers positive effects as a lever for improvement but also has potential for negative unintended consequences, such as gaming or a blinkered preoccupation with a small number of published, often easily measurable, metrics. The power of public reporting means there is an imperative to measure and report comparative information accurately, fairly and meaningfully. Given the complexity of healthcare systems, this is a real challenge. Healthcare services – the principal subject of performance reporting efforts – are shaped, directly and indirectly, by a wide array of organisations and professionals working with patients. There is a huge variety and volume of tasks undertaken to diagnose, deliver, support, guide and assure provision of care that improves people’s health.

Clinicians’ sensitivity to comparative data, and the strong debates that media coverage can generate, mean that reporting must be comprehensive, systematic and rigorous. Seeking breadth and comprehensiveness in performance reporting has seen a burgeoning of measures that reflect different aspects of performance in complex systems. However, this has contributed to what the Institute of Medicine has called ‘indicator chaos’, suggesting that there are too many indicators and that they are poorly delineated [5]. Paradoxically, at the same time, there are concerns that indicators or concepts have been too narrowly focused, and these have led to calls for broader and more expansive measurement of outcomes and value [6, 7].

This paradox may in part be a reflection of the absence of a clear definition of high performance [8] and a lack of conceptual clarity about how to assess performance domains within complex adaptive systems. Frameworks feature in many settings and have been used to guide public reporting efforts [9,10,11,12,13,14,15,16,17,18,19]. These existing frameworks have been successful in sorting and classifying different metrics into related thematic areas such as access, patient-centeredness, safety or efficiency. They are often a reflection of whatever data are available and the particular aspects of healthcare delivery that are the focus of current policies or priorities.

However, few are clearly grounded in theory or make explicit links between conceptualisation and operationalisation of performance measurement [1, 20, 21]. Many existing frameworks are populated through the use of a Delphi process to select healthcare quality indicators [22]. While useful and insightful for many applications, the use of Delphi processes – for indicator selection or for framework construction – does not necessarily result in a clear, conceptually sound framework.

This paper seeks to apply a theoretically grounded approach to performance framework development, drawing on similar efforts to link multidisciplinary bodies of knowledge in non-health contexts. Jabareen [23] refers to a conceptual framework as a network of interlinked concepts that together provide a comprehensive understanding of a phenomenon or phenomena, and asserts that building such a framework is an iterative process – one that requires an understanding of the relationships between the concepts that provide the building blocks for the overall framework: “A conceptual framework is not merely a collection of concepts, but rather a construct in which each concept plays an integral role” (p. 51). We used such an approach to inform the creation of comprehensive, conceptually grounded measurement systems.

Methods

An eight-phase approach, described by Jabareen (2009), guided the framework development (Fig. 1). These eight phases are: 1) Mapping selected data sources; 2) Categorising the selected data; 3) Identifying and naming concepts; 4) Deconstructing and categorising the concepts; 5) Integrating concepts; 6) Synthesis, resynthesis and making sense; 7) Validating the conceptual framework; and 8) Rethinking the conceptual framework. While elements of our approach resonate with those of a scoping review [24, 25], our purpose was not to map the available evidence, but to collect the range and distribution of concepts that have been used to measure performance, together with their theoretical basis.

Fig. 1 Schematic of the framework development approach

A review of the academic and grey literature identified concepts and frameworks related to the measurement of performance in healthcare. A targeted search of the websites of performance reporting agencies at international, national and, when appropriate, subnational levels; jurisdictional health ministries or departments; and major health services research organisations was supplemented by a rapid literature review based on searches of the Database of Abstracts of Reviews of Effectiveness (DARE), the Cochrane Database of Systematic Reviews, and the PubMed electronic database. Search terms were: “conceptual framework” AND performance; “concept map*” AND performance; “performance framework”. PubMed searches (to June 2017) identified 648 articles.

Citations were screened for suitability for inclusion by one author (KS). A total of 62 articles and 27 grey literature reports fulfilled the selection criteria (i.e. described a conceptual framework or model that sought to measure, assess and/or report on performance in high income healthcare systems). Papers and reports simply using or citing another framework were not retained (Appendix 1).

A bespoke data extraction tool was developed and applied by one author (KS) to capture relevant terms and constructs used in each of the retained frameworks, articles and reports. These were clustered independently by both authors into broad categories, combining related terms such as access, accessibility, access to care and affordability. Any differences in categorisation were resolved via discussion. Using an interpretive review approach [24], the broad categories were critically assessed in terms of their underlying assumptions and the extent to which the constructs are directly measurable. Concepts were then categorised according to whether they reflect patient, provider, population or system perspectives. Interdependencies and relationships between the concepts were described through a process of independent synthesis by each researcher, followed by comparison and resolution, and then independent resynthesis, again followed by comparison and resolution. A visual representation was then developed iteratively.

Results

A total of 19 performance frameworks were identified and included in our review (see Additional file 1) [5, 9, 10, 12, 13, 17, 21, 27, 28, 29, 30, 31, 32, 33, 34, 35, 38]. The content of each framework was analysed with regard to the performance constructs mentioned. A total of 110 distinct terms were featured in the frameworks. These were clustered into 17 broad concept groups (see Appendix 2). The most commonly used concepts were ‘appropriateness’ (featured in 19 frameworks), ‘efficiency’ (15), ‘safety’ (14), ‘accessibility’ (13), ‘equity’ (12), ‘impact’ (11) and ‘effectiveness’ (11). In addition to the wide variety of terms or constructs featured in the performance frameworks, there was also variation in the extent to which they included directly measurable or derived constructs. Our mapping exercise identified three approaches that have been used to underpin performance measurement efforts: logic models, theory-based models and goal-achievement models.

Typology of frameworks of performance measurement

The first set of frameworks conceptualise performance as the relationship between inputs, activities, outputs and outcomes. These models draw on flows of production [37] and logic models [38, 39]. They often build on the structure, process and outcome categorisation of healthcare proposed by Donabedian [40], where structure describes the settings in which care is delivered and the physical, human and financial resources required; process refers to the patient and practitioner activities involved in giving and receiving care; and outcome describes the effects of care in terms of changes in health status, patients’ knowledge and/or behaviour, and patient satisfaction.

The second set of frameworks are more theoretically based and conceptualise performance in terms of functions or roles within systems. These models draw on Parsons’ theory of social action [41], where performance is seen as achievement in the functions of adaptation, goal attainment, production and values maintenance [42, 43]. In this paradigm, the alignment and balance of these key functions is the primary concern, rather than the actual relationships between inputs and outputs. Adaptation is defined as the ability to secure resources and to shape structures, systems and processes according to community needs. Goal attainment is defined as the achievement of targets relating to population health and equity. The production function relates to the quantity and quality of services. Finally, the values maintenance function refers to how systems maintain their capacity and continually develop and evolve.

The third set of frameworks assess performance in terms of societal goal achievement. These models are conceptually agnostic and depend on the definition and codification of a set of values, standards or objectives against which performance is to be judged. These models are grounded in the organisational literature on scientific management, goal setting [44] and management by objectives [45], and consider socially determined goals as core to performance and the strategic orientation required to assess performance. Within this paradigm, assessment of performance involves evaluating the extent to which goals are realised or achieved.

Introducing an integrated measurement framework of performance

Building upon the conceptual antecedents in the existing performance frameworks described above, our framework integrates all three of these approaches (see Table 1 and Additional file 2). The new framework builds on a logic model base – but moves away from simplistic ‘counts’ of needs, resources, activities or outcomes – recognising that increases or decreases in any of these do not necessarily correspond to an improvement or deterioration in performance. It also incorporates functional aspects of healthcare systems through derived constructs such as accessibility, effectiveness and safety, and encapsulates goals of the healthcare system such as equity and impact. Crucially, it acknowledges complexity and dynamism and considers performance using a perspective that relates measurable elements to each other (e.g. patients’ needs and expectations with activity), allowing more meaningful judgements to be made about performance in context and across different time horizons.

Table 1 Measurable, functional and goal-oriented constructs

The integrated framework proposes five measurable constructs (patients’ needs and expectations; healthcare resources and structures; receipt and experience of healthcare services; healthcare processes, functions and context; and healthcare outcomes) which are generally used to populate logic model approaches. These five elements are the aspects of healthcare performance that can be directly measured through quantitative data collection systems or approaches.

Building on these measurable constructs and encapsulating a functional approach, the framework identifies 10 derived constructs of performance (coverage; accessibility; appropriateness; safety; effectiveness; productivity; efficiency; adaptability; sustainability; resilience). The framework also incorporates two overarching derived constructs that relate to goal achievement (population health impact; equity), recognising their importance in many healthcare systems. Equity is an overarching construct that relates to the population distribution of the other constructs, such as access, appropriateness and effectiveness. Similarly, impact is an overarching construct determined by the cumulative contribution of all the other constructs. While these are the 12 key constructs of performance, they cannot be measured directly; their measurement requires combinations of the five measurable constructs (e.g. to derive efficiency measures, the resources expended are assessed in relation to the outcomes achieved). Figure 2 illustrates the relationship between measurable and derived constructs of performance.
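To make the logic of derived constructs concrete, the following minimal sketch (not part of the framework itself; all figures and variable names are hypothetical, chosen for illustration only) shows how three derived constructs arise as ratios of the measurable constructs:

```python
# Illustrative sketch: derived constructs are relationships between
# measurable constructs, not raw counts. All figures are hypothetical.

measured = {
    "patients_needing_intervention": 500,    # patients' needs and expectations
    "patients_receiving_intervention": 420,  # receipt of services
    "resources_expended": 1_000_000,         # resources and structures (e.g. dollars)
    "outcomes_achieved": 380,                # healthcare outcomes (e.g. successful treatments)
}

# Accessibility: receipt of services relative to need
accessibility = (measured["patients_receiving_intervention"]
                 / measured["patients_needing_intervention"])

# Effectiveness: outcomes achieved relative to services received
effectiveness = (measured["outcomes_achieved"]
                 / measured["patients_receiving_intervention"])

# Efficiency: outcomes achieved relative to resources expended
efficiency = measured["outcomes_achieved"] / measured["resources_expended"]

print(f"accessibility = {accessibility:.2f}")  # 0.84
print(f"effectiveness = {effectiveness:.2f}")  # 0.90
print(f"efficiency    = {efficiency:.6f} outcomes per dollar")
```

The same raw count (patients receiving the intervention) feeds different constructs depending on what it is related to, which is the central point of the derived-construct approach.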

Fig. 2 Integrated performance measurement framework

Table 2 summarises the inclusion of constructs in selected frameworks from the literature. From this table we can see that very few frameworks are theoretically based, but those that are tend to be more comprehensive in construct inclusion.

Table 2 Inclusion of constructs in selected frameworks from the literature

Discussion

Looking across existing performance assessment frameworks, it is clear that there is a core set of constructs that resonate across contexts and jurisdictions. Topics such as access, safety, quality, responsiveness, effectiveness and productivity all feature prominently in the existing literature. However we found very few frameworks that are structured in line with an underlying theory or a conceptual framework that can be applied empirically to capture the entire breadth of constructs required to understand healthcare performance.

We constructed a conceptual framework of healthcare performance measurement that relies on logic, theorisation and mapping. It revealed two key principles. The first principle is that performance is a relative construct – it reflects an outcome in relation to a need; a tangible change in relation to context; a benefit in relation to a cost. While this seems self-evident, it is not always the basis on which healthcare performance is measured and reported. Very often, measurable constructs such as number of hospitalisations, procedures or beds are used to populate indicator sets. More meaningful assessment requires derived metrics that place various elements of healthcare delivery relative to others. For example, it is not possible to directly measure accessibility – rather, patients’ receipt of services has to be measured and considered relative to their needs and expectations.

The second principle is that performance is multi-layered and therefore a contested construct. It is often referred to in terms of ‘value’ or ‘quality’ [26] – notions which can differ across patient, provider, system and population perspectives. Again, this is not a revolutionary assertion but to date, it has not been fully encapsulated in healthcare performance frameworks.

The horizontal axis of the model encapsulates patients’ perspectives – it represents the notion of right care, right time, right way and right amount. To assess performance from a patient perspective, we develop measures that relate, in various combinations, patients’ needs and expectations, services received, and outcomes. The model shows there are two key parallel constructs linking patient-related measurable elements: ‘patients’ needs and expectations’ and ‘receipt and experience of services’ are bridged by accessibility and appropriateness – reflecting, respectively, whether any care was received and whether that care was proportionate and tailored to patients’ needs and expectations.

In the central vertical axis, the model considers value from a delivery or organisational perspective – spanning resources and structures (classically referred to as inputs), services provided (outputs); and functions, processes and context (where functions refer to key deliverables such as health promotion, processes refer to priority setting and assurance; and context refers to broader elements such as social determinants of health). The derived constructs are productivity and sustainability.

The framework reveals how apparently similar indicators can in fact reflect different constructs of performance. Indicators are shaped by the measurable components they draw upon, and multiple combinations or permutations are possible. For example, while the number of patients who received a specific intervention is purely a descriptor of receipt of care, once this measure is linked with the number of patients requiring the intervention it reflects accessibility; if it is instead related to the change in patients’ health status, it becomes a measure of effectiveness.

The framework also highlights that there can be reinforcing and antagonistic relationships and feedback loops between constructs that change over time. This may help explain observed unintended consequences of performance reporting [46]. Such consequences may reflect efforts that oversimplify performance assessment by describing what happened in healthcare (e.g. the volume and attributes of services provided) but fail to consider these in relation to each other – thereby missing an opportunity to generate understanding (e.g. revealing accessibility by combining the volume of services provided with the number of people requiring them). The derived constructs are logically linked together. Reinforcing relationships exist – for example, between efficiency and coverage (where finite resources are not wasted, there is potential for greater coverage); between the delivery of appropriate care and the resulting effectiveness and impact; or between gains in accessibility and the achievement of more equitable healthcare. Reinforcing loops can also act in concert, where weak performance in one construct has a dampening effect on others. For example, low efficiency equates to fewer available resources, fewer activities and less coverage; ineffective treatments represent inefficient care; and poor coverage leads to reduced equity and impact.

Conversely, strong performance in one domain can have an antagonistic or detrimental effect on another. For example, over-emphasis on effectiveness can come at a heavy cost - many innovations and new therapies can entail high costs for marginal incremental benefits and therefore might reduce overall efficiency; or high levels of efficiency may only be achieved at the expense of population groups that are more difficult to reach and have less chance of benefitting from treatments, reducing the equity of the system; or maintaining very high levels of appropriateness and responsiveness in some clinical areas might reduce a system’s capacity to ensure a widespread coverage.

The complexity of the interplay between constructs of performance is further heightened when temporality is considered. For example, accessibility that influences performance in a baseline year can affect impact in future years. In a complex dynamic system such as health, maximising the results in any single construct is difficult, if not impossible, to achieve. Even if it were possible, it is not desirable. Given their interdependencies, maximising one construct would likely have unintended consequences on others. Measuring them simultaneously is therefore very important [47].

Towards a ‘measurement system’

The notion of a measurement system has been present in the broader management literature for 50 years [48]. Much of this work clearly differentiates measurement that is for system or performance management purposes from measurement that is for benchmarking and improvement purposes [49]. In health, this is a key distinction with Ministries or Departments of Health often focused on performance management while agencies mandated to secure quality improvement and clinical innovation are more focused on identifying areas of variation in the delivery of healthcare to patients and ways to address them.

The proposed framework can support more comprehensive assessment and also provide transparency about decisions regarding which aspects of performance are measured and reported. Its principal purpose is measurement – although it has clear relevance for quality improvement and for policy.

Healthcare system performance reporting efforts to date have generally featured a preponderance of the simple measurable elements (e.g. utilisation) and a lack of measurement of derived constructs (e.g. organisational functioning). In populating frameworks with metrics, indicators and data, most systems have used a pragmatic approach, focusing on readily available measures and aspects that are relevant to specific policies and contexts, but remain mute about important aspects of performance. In some cases, this has led to sets of disparate indicators that are unable to provide a comprehensive picture of performance. There is a growing recognition that performance measurement efforts should move beyond opportunistic and piecemeal approaches to indicator selection and towards deliberative filling of information gaps [50].

Conceptual clarity does not mean that a handful of measures will do. Over-reliance on a very small number of metrics implies a strong correlation of performance across all constructs, whether measured or not. However, this has not been supported empirically, and a number of studies report only weak correlations between different metrics [51,52,53]. In other words, performance in one construct of healthcare is not necessarily informative about performance in others [54].

Overall, we found that most performance frameworks lack a theoretical basis. This lack of theory has perhaps led to a proliferation of measurement scorecards and frameworks that are based on empiricism, which has meant that what is measurable features in many performance framework efforts. More theory or conceptual clarity may bring more parsimony: we do not need every permutation of measurement to understand the balance between aspects of performance. What we need is a well-constructed, conceptually sound model that can be used for a range of purposes – measurement, quality improvement and policy – and in a range of contexts.

Some frameworks currently in widespread use provide only a partial picture of performance – albeit a critically important one. For example, the Institute for Healthcare Improvement’s hugely influential “triple aim” [32] encapsulates elements of appropriateness (experience of care), efficiency (per capita cost) and impact (population health), but it provides a partial view of performance. While its simplicity attracts attention, it can be considered somewhat reductive, overlooking essential elements of performance such as accessibility or equity. Simple models resonate, but the trade-off is a loss of sensitivity to complexity. Paradoxically, more conceptually grounded frameworks may appear piecemeal because they may contain empty categories: our ability to quantify with data may not yet be advanced sufficiently to fill all the conceptual categories. Such a framework does, however, allow for future-proofing and guides data collection and analysis efforts.

Similarly, there is a clear tension between trying to summarise whole health system performance in a ‘single number’ measure and juxtaposing fragmentary metrics that inform about parts of the system [1]. Trying to integrate measures of performance into a single score is likely to prove a meaningless task. Embracing the complexity of performance with a framework that enables a clearer assessment and understanding of which data can inform the constructs of performance to be assessed, and how they relate to each other, is more productive than trying to oversimplify performance. For example, understanding trends in the needs of the population allows a better understanding of why coverage may be decreasing, and reflects on adaptability. In assessing overarching constructs such as equity, sophisticated performance measurement approaches are able to reveal how disparity may be explained by another construct, such as coverage (e.g. unwarranted variation in receipt of surgery by socioeconomic status, as a result of resource allocation that does not match needs for surgery).
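The surgery example above can be sketched in a few lines. In this minimal illustration (all group names and figures are hypothetical, and the ‘equity gap’ summary is one of many possible dispersion measures, chosen here purely for demonstration), equity is assessed as the distribution of a derived construct, coverage, across socioeconomic groups:

```python
# Illustrative sketch (hypothetical figures): equity as the distribution
# of a derived construct (coverage) across population subgroups.

surgery_received = {"low_ses": 120, "mid_ses": 300, "high_ses": 280}
surgery_needed   = {"low_ses": 250, "mid_ses": 400, "high_ses": 300}

# Coverage per socioeconomic group: receipt of surgery relative to need
coverage = {g: surgery_received[g] / surgery_needed[g] for g in surgery_needed}

# A simple summary of inequity: the spread between the best- and
# worst-served groups (one possible dispersion measure among many).
equity_gap = max(coverage.values()) - min(coverage.values())

for group, value in sorted(coverage.items()):
    print(f"{group}: coverage = {value:.2f}")
print(f"equity gap = {equity_gap:.2f}")
```

A single aggregate coverage figure would hide the disparity that the per-group breakdown reveals, which is the point the framework makes about overarching constructs being explained by the distribution of other constructs.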

Limitations of the framework

The framework provides a clear set of principles for measuring performance. While it is represented by a simple visualisation, it encapsulates considerable complexity and requires substantial effort to be populated with a comprehensive set of indicators. This breadth means that the framework may not be best suited to targeting efforts or focusing a system on key current problems. The metrics highlighted in the paper are quantitative in nature; however, ‘soft intelligence’ or experiential evidence is increasingly considered an important additional source of performance information, capturing the perspectives of actors and sensitivities to context [55]. Because we have distilled the variety of terms used in many frameworks, policy makers and clinicians might be unsettled not to see some classic constructs, such as ‘quality’, that have previously been promoted. The review adopted neither a systematic review nor a scoping review method, because its purpose was to collect the range and distribution of concepts that have been used to measure healthcare performance, together with their theoretical basis, and the more conceptually grounded approach of Jabareen better aligned with that purpose. That said, the search phase was extensive, and seminal reports and papers were used in a snowball approach to access other key references and ensure an exhaustive review. Finally, because performance measurement is most salient in high income countries, the framework produced is mostly applicable to these settings, and its transposition to other healthcare settings remains to be assessed.

Conclusion

Our proposed framework encapsulates well used concepts seen in many previously published frameworks. While much of it seems familiar, it does involve a fundamental shift in thinking about performance assessment and its conceptualisation - bringing explicit recognition of the complexity and interconnectedness that constitutes performance. The framework provides a means to resolve what sometimes appears to be “indicator chaos” by identifying 12 clearly defined constructs of performance that synthesise over 100 different constructs used previously and can be used to reflect different perspectives and roles.

Focusing on derived aspects of performance drives assessment efforts beyond description. The proposed framework leverages constructs used widely in other frameworks (e.g. accessibility, appropriateness, effectiveness) [18] but where previous efforts have often considered elements of performance in isolation, the framework proposed here uses measures that are dynamic, sensitive to context, and to interlinked processes in healthcare delivery. Moving forward, this approach can help highlight current gaps in indicators and drive the development of measures that truly reflect performance – moving beyond simplicity to insight – combining different pieces of data to develop a more meaningful measure of performance, and one that does not focus solely on outcomes.

Performance as a concept can be beguilingly simple. Similarly, the proposed framework is, at first glance, visually simple. However, both performance and the framework are multilayered and complex, shaped by actions, reactions and interactions in an interconnected network. Performance is difficult to measure in a meaningful way, requiring scientific rigour and acumen to gauge progress, guide future development and reassure the public.

Reporting with care and rigour is needed to prevent unnecessary damage to professional or organisational reputations. Such damage affects maligned parties but also can undermine the credibility and acceptability of wider efforts to measure and report on performance. When data are used in the public domain, contributing to the democratic process and social choice, there is little room for spurious associations, erroneous assessments or simplistic measures.

Performance assessment in healthcare is a multi-billion dollar effort – healthcare is one of the most important social services provided to citizens around the world. Trust in published information is an essential feature of high quality healthcare. It is imperative both for accountability and for catalysing continuous improvement that we use assessment frameworks that properly reflect performance, lauding achievements and highlighting areas for renewed efforts to change.

Availability of data and materials

All data generated or analysed during this study are included in this published article [and its supplementary information files].

References

  1. Papanicolas I, Smith PC. Health system performance comparison: an agenda for policy, information and research. Maidenhead: McGraw-Hill; 2013.

  2. Levesque JF, Sutherland K. What role does performance information play in securing improvement in healthcare? A conceptual framework for levers of change. BMJ Open. 2017;7(8):e014825.

  3. Campanella P, et al. The impact of public reporting on clinical outcomes: a systematic review and meta-analysis. BMC Health Serv Res. 2016;16:296.

  4. Fung C, et al. Systematic review: the evidence that publishing patient care performance data improves quality of care. Ann Intern Med. 2008;148:111–23.

  5. Institute of Medicine. Vital signs: core metrics for health and health care progress. Washington, DC: The National Academies Press; 2015.

  6. Porter M. What is value in health care? N Engl J Med. 2010;363:2477–81.

  7. Porter M, Lee T. From volume to value in health care: the work begins. JAMA. 2016;316:1047–8.

  8. Ahluwahlia S, Damberg C, Silverman M, et al. What defines a high-performing health care delivery system: a systematic review. Jt Comm J Qual Patient Saf. 2017;43:450–9.

  9. Arah OA, Westert GP, Hurst J, et al. A conceptual framework for the OECD Health Care Quality Indicators Project. Int J Qual Health Care. 2006;(Suppl 1):5–13.

  10. Murray CJ, Frenk J. A framework for assessing the performance of health systems. Bull World Health Organ. 2000;78:717–31.

  11. Leatherman S, Sutherland K. Quest for quality. London: The Nuffield Trust; 2008.

  12. Canadian Institute for Health Information. A performance measurement framework for the Canadian health system. Ottawa: CIHI; 2013.

  13. Agency for Healthcare Research and Quality. 2014 National Healthcare Quality and Disparities Report. U.S. Department of Health and Human Services; 2014.

  14. Davis K, Stremikis K, Squires D, et al. Mirror, mirror on the wall, 2014 update: how the U.S. health care system compares internationally. New York: The Commonwealth Fund; 2014.

  15. Carinci F, Van Gool K, Mainz J, et al. Towards actionable international comparisons of health system performance: expert revision of the OECD framework and quality indicators. OECD health care quality indicators expert group. Int J Qual Health Care. 2015;27:137–46.

  16. van den Berg MJ, Kringos DS, Marks LK, et al. The Dutch Health Care Performance Report: seven years of health care performance assessment in the Netherlands. Health Res Policy Syst. 2014;12:1.

  17. Productivity Commission. The approach to performance measurement. In: Report on Government Services (RoGS); 2017 (and previous editions).

  18. Braithwaite J, Hibbert P, Blakely B, et al. Health system frameworks and performance indicators in eight countries: a comparative international analysis. SAGE Open Med. 2017;5:1–10.

  19. Ham C, Raleigh V, Foot C, et al. Measuring the performance of local health systems: a review for the Department of Health. London: The King’s Fund; 2015.

  20. Sicotte C, Champagne F, Contandriopoulos A, et al. A conceptual framework for the analysis of health care organisations’ performance. Health Serv Manag Res. 1998;1:24–41.

  21. Atun R, Menabde N. Health systems and systems thinking. In: Coker R, Atun R, McKee M, editors. Health systems and the challenge of communicable diseases: experiences from Europe and Latin America. Buckingham: Open University Press; 2008.

  22. Boulkedid R, et al. Using and reporting the Delphi method for selecting healthcare quality indicators: a systematic review. PLoS One. 2011;6(6).

  23. Jabareen Y. Building a conceptual framework: philosophy, definitions and procedure. Int J Qual Methods. 2009;8(4).

  24. Dixon-Woods M, Cavers D, Agarwal S, et al. Conducting a critical interpretive synthesis of the literature on access to healthcare by vulnerable groups. BMC Med Res Methodol. 2006;6:35.

  25. Schoen C, How S. National scorecard on U.S. health system performance: technical report. New York: The Commonwealth Fund; 2006.

  26. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press; 2001.

  27. NHS England. NHS outcomes framework 2015–16. London: Department of Health; 2015.

  28. Veillard J, Champagne F, Klazinga N, et al. A performance assessment framework for hospitals: the WHO regional office for Europe PATH project. Int J Qual Health Care. 2005;17:487–96.

  29. Vrijens F, Renard F, Jonckheer P, et al. The Belgian health system performance report 2012: snapshot of results and recommendations to policy makers. Health Policy. 2013;112:133–40.

  30. NHS Scotland. A route map to the 2020 vision for health and social care. Edinburgh; 2013.

  31. Chen L, Wang Y. A conceptual framework for Taiwan’s hospital clinical performance indicators. J Formos Med Assoc. 2015;114:381–3.

  32. Stiefel M, Nolan K. A guide to measuring the triple aim: population health, experience of care, and per capita cost. Cambridge, MA: Institute for Healthcare Improvement; 2012.

  33. Langton J, Wong S, Johnston S, et al. Primary care performance measurement and reporting at a regional level: could a matrix approach provide actionable information for policy makers and clinicians? Healthc Policy. 2016;12:33–51.

  34. International Health Partnership and World Health Organization. Monitoring, evaluation and review of national health strategies: a country-led platform for information and accountability. Geneva: WHO; 2011.

  35. Marchal B, Hoeree T, Campos da Silveira V, et al. Building on the EGIPSS performance assessment: the multipolar framework as a heuristic to tackle the complexity of performance of public service oriented health care organisations. BMC Public Health. 2014;14:378.

  36. Hsiao W. What is a health system? Why should we care? Cambridge, MA: Harvard School of Public Health; 2003.

  37. Smith PC. Measuring value for money in healthcare: concepts and tools. London: The Health Foundation; 2009.

  38. Knowlton L, Phillips CC. The logic model guidebook: better strategies for great results. Los Angeles: Sage; 2009.

  39. Frechtling J. Logic modelling methods in program evaluation. San Francisco: Jossey-Bass; 2007.

  40. Donabedian A. The quality of care: how can it be assessed? JAMA. 1988;260:1743–8.

  41. Parsons T. Structure and process in modern societies. New York: Free Press; 1960.

  42. Ritzer G. Sociological theory. 3rd ed. New York: McGraw-Hill; 1992.

  43. Adams B, Sydie R. Sociological theory. Thousand Oaks: Pine Forge Press; 2001.

  44. Locke E, Latham G, editors. New developments in goal setting and task performance. New York: Routledge; 2013.

  45. Drucker P. The practice of management. Routledge; 1954.

  46. Mannion R, Braithwaite J. Unintended consequences of performance measurement in health care. Intern Med J. 2012;42:569–74.

  47. Powell A, White K, Partin M, et al. Unintended consequences of implementing a national performance measurement system into local practice. J Gen Intern Med. 2011;27:405–12.

  48. Neely A, Gregory M, Platts K. Performance measurement system design. Int J Oper Prod Manag. 2005;25(12).

  49. Camp R. Benchmarking: the search for industry best practices that lead to superior performance. Milwaukee, WI: ASQC Quality Press; 1989.

  50. Meltzer D, Chung J. The population value of quality indicator reporting: a framework for prioritizing health care performance measures. Health Aff. 2014;33:132–9.

  51. Wilson IB, Landon BE, Marsden PV, et al. Correlations among measures of quality in HIV care in the United States: cross sectional study. BMJ. 2007;335:1085–91.

  52. Rosenthal GE. Weak associations between hospital mortality rates for individual diagnoses: implications for profiling hospital quality. Am J Public Health. 1997;87:429–33.

  53. Fischer C, Steyerberg E, Fonarow G, et al. A systematic review and meta-analysis on the association between quality of hospital care and readmission rates in patients with heart failure. Am Heart J. 2015;170:1005–17.

  54. Profit J, Typpo K, Hysong S, et al. Improving benchmarking by using an explicit framework for the development of composite indicators: an example using pediatric quality of care. Implement Sci. 2010;5:13.

  55. Martin G, McKee L, Dixon-Woods M. Beyond metrics: utilizing ‘soft intelligence’ for healthcare quality and safety. Soc Sci Med. 2015:19–26.

  56. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19–32.

  57. Levac D, Colquhoun H, O’Brien K. Scoping studies: advancing the methodology. Implement Sci. 2010;5:69.

  58. Bureau of Health Information. Healthcare in Focus 2014: how does NSW compare? Sydney (NSW): BHI; 2015.


Acknowledgements

The authors would like to acknowledge the contribution of Lisa Corscadden, from the Bureau of Health Information (BHI) in New South Wales, to the scoping of the international literature and reports. A previous iteration of the framework was published in BHI reports, when the authors were employed by that organisation.

Funding

No external source of funding.

Author information

Contributions

JFL conceptualised the distinction between measurable and derived performance constructs. KS conducted the literature searches and led the mapping processes. Both authors made significant contributions in categorising constructs, developing the conceptual framework, drafting and editing. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Jean-Frederic Levesque.

Ethics declarations

Ethics approval and consent to participate

No subjects were interviewed, observed or exposed to any interventions as part of this study. No ethical review was sought by the authors for this literature synthesis.

Consent for publication

Not applicable.

Competing interests

The authors are full time employees of NSW Health. The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Appendices

Appendix 1

Fig. 3 Inclusion and exclusion flowchart

Appendix 2

Table 3 Clustering of terms used in the literature according to the framework’s constructs

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Levesque, J., Sutherland, K. Combining patient, clinical and system perspectives in assessing performance in healthcare: an integrated measurement framework. BMC Health Serv Res 20, 23 (2020). https://doi.org/10.1186/s12913-019-4807-5


Keywords

  • Performance measurement
  • Conceptual framework
  • Quality improvement