
The impact of public performance reporting on cancer elective surgery waiting times: a data linkage study

Abstract

Background

Excessive waiting times for cancer elective surgery are a concern in publicly funded healthcare systems. Several countries including Australia have introduced healthcare reforms involving time-based targets and public performance reporting (PPR) of hospital data. However, there is mixed evidence of their benefits. We sought to examine the impact of targets and PPR of cancer elective surgery waiting times on access to breast, bowel and lung cancer elective surgery.

Methods

We analysed routinely-collected linked data on admissions and waiting times for patients aged 15 years or over (n = 199,885) who underwent cancer surgery in a public hospital in Victoria, Australia over a 10-year period. We conducted difference-in-differences analyses to compare waiting times before (2006–07 to 2011–12) and after (2012–13 to 2015–16) the introduction of PPR in meeting these targets.

Results

Across all cancer types, urgent patients were all treated within 30 days both before and after PPR. Following PPR, there was a slight increase in mean waiting times across all cancer types and urgency categories. Patients with lung cancer waited on average two and a half days longer for treatment, and patients with breast cancer waited on average half a day less. There was no effect of PPR on waiting times for patients with bowel cancer across urgency categories.

Conclusions

Our findings suggest that time-based targets and PPR had minimal impact on surgical waiting times. This may be due to reasonable waiting times prior to PPR, improved efficiency being masked by 20% growth in the population, lack of public knowledge that waiting times are publicly reported, or lack of real-time reporting to drive behavioural change. The use of generic elective surgery recommended waiting time measures for cancer is discussed.


Background

Effective cancer care relies on timely access to treatment. For patients, delays in accessing needed treatment may worsen health outcomes, impair quality of life, induce anxiety and distress, and prolong sick leave and loss of income [1, 2]. For the public, delays in access can exacerbate health inequities, and lead to media and political pressure to address barriers to care [3, 4]. To improve the timeliness of elective surgery for cancer and other conditions, many countries have introduced healthcare reforms involving national targets and performance measures [5]. Previous studies have shown that patients with cancer support public reporting of care quality, timeliness, and patient satisfaction/experience [6, 7].

Waiting time for elective surgery is one of the most common measures of healthcare system performance. To improve the quality of hospital services, several countries publish national waiting times on websites, by specialty and procedure, using hospital administrative data [8]. Some countries, such as England, report at hospital and surgeon levels and publish patient narratives [9, 10]. Public performance reporting (PPR) of hospital data is thought to improve quality of care by informing patients’ choice of provider, or by prompting organisations to address areas of underperformance [11]. Waiting times are generally measured as the time between the specialist placing the patient on the waiting list and the date of surgery. The wait between a general practitioner’s (GP) referral and the initial specialist appointment is not included, giving rise to a ‘hidden’ waiting list [12]. To better capture waiting times across the whole patient journey, some Nordic countries seek to measure the time between a GP’s referral and the date of surgery [13].

International evidence suggests that elective surgery targets can reduce waiting times and improve access over time [14]. The evidence is stronger in England [15, 16]. Ambitious waiting time targets with rewards and sanctions, increased funding and robust performance management, were cited as key factors contributing to the English success [14, 17]. Despite the reduction in waiting times, there was variation in waiting times between specialties, operative procedures and hospitals [18]. For PPR, there was a lack of evidence to suggest that publishing elective surgery waiting times motivated changes in providers’ behaviour to reduce waiting times or encouraged patients to choose providers based on PPR [19]. Targets and PPR had unintended consequences such as increased pressure on clinicians [20] and gaming of performance data [19, 21].

Previous research on waiting times has focused on certain specialties or on all elective surgery [22,23,24]. Few studies have addressed delays in accessing cancer care [1, 2], so the impact of targets and published waiting times for cancer elective surgery remains uncertain. Given the significant burden of morbidity and mortality associated with cancer [25], and concerns about the fragmentation of cancer care, some countries have developed specific strategies for improving access to cancer care [26]. These include collecting and monitoring waiting times from referral to first treatment for cancer. For example, England introduced targets for maximum waiting times across the cancer care pathway: a specialist appointment within 14 days of GP’s referral (93% target); first treatment within 31 days of diagnosis (96% target); and first treatment within 62 days of GP’s referral (85% target) [27]. There has been limited exploration of these issues in other international contexts, including Australia.

In 2011, the Australian government introduced national elective surgery targets and PPR of hospital data as part of a suite of national healthcare reforms to improve health outcomes. The objectives of the reforms were to optimise the quality and timeliness of hospital care for Australians [28]. State and Territory governments were offered financial rewards if certain national elective surgery targets were reached [29]. Implementation of the 2011 reforms was the responsibility of the States and Territories, with the Commonwealth government responsible for PPR. The national elective surgery targets, underpinned by the Performance and Accountability Framework [30], included elective surgery waiting times by urgency category (i.e. urgent: within 30 days; semi-urgent: within 90 days; non-urgent: within 12 months) and waiting times for cancer care. The MyHospitals website was established in 2010. All public hospitals were mandated to publicly report their performance on the MyHospitals website [31]; PPR is voluntary for private hospitals. The indicators publicly reported for breast, bowel and lung cancer at the time of the study included elective surgery waiting times by urgency and the number of patients treated within recommended elective surgery waiting times by urgency.

Victoria is the second most populous state in Australia; during the study period its population grew from 5.13 to 6.24 million – an increase of more than 20% [32]. Australia has a universal healthcare system publicly funded through the Medicare scheme [33], including free access to treatment in public hospitals. A typical treatment pathway for cancer involves initial assessment by a GP or emergency department (ED) physician followed by referral to a specialist in either the public or private healthcare sector. Acute conditions (such as a bowel cancer causing bowel obstruction) are treated immediately. Otherwise, patients are booked for ‘elective’ treatment or placed on a waiting list. Waiting times for publicly funded healthcare services arise when the demand for services exceeds the available supply. Population growth, an ageing population, increasing prevalence of chronic disease, funding constraints, and workforce shortages all contribute to this imbalance [34]. In the absence of price-rationing within the public system, waiting lists based on clinical need act as a form of non-price rationing to balance the demand for, and the supply of, health services. Around 40% of Australians choose to purchase private healthcare, or self-fund private care, which enables them to “skip the queue” in the public system by accessing elective surgery in private hospitals [35].

To our knowledge, our study is the first to examine the impact of the introduced targets and of PPR on cancer surgery waiting times in Victoria, Australia, principally in the public hospital system. Linkage of hospital admissions and elective surgery waiting time data provided an opportunity to examine whether the introduction of targets and PPR led to improvement in cancer elective surgery access during the 2006–2016 period, as measured by cancer waiting times indicators reported on the MyHospitals website. We sought to address the following research question: Does the introduction of PPR reduce waiting times for cancer types in which elective surgery waiting times are publicly reported, as compared to other cancer types in which elective surgery waiting times are not publicly reported?

Methods

Study design

The study involved a quasi-experimental research design using linked hospital admissions and elective surgery waiting time data to understand the effect of targets and PPR on cancer elective surgery waiting times. The pre-PPR period was defined as 2006–07 through 2010–11, and the post-PPR period as 2011–12 through to 2015–16.

Data sources

Victorian Admitted Episodes Dataset

The Department of Health and Human Services mandates all Victorian hospitals to report patient admission activity to the Victorian Admitted Episodes Dataset (VAED) under the Health Services Act 1988 [36]. Data collection began in 1979 and continues to the present day with some changes to meet updated national reporting requirements. The VAED includes demographic and clinical information for each admitted episode of patient care, with clinical information coded according to the International Statistical Classification of Diseases and Related Health Problems, Tenth Revision, Australian Modification and the Australian Classification of Health Interventions [37].

The Elective Surgery Information System

The Elective Surgery Information System (ESIS) was introduced in 1997 to monitor access to elective surgery. Victorian public hospitals are required to provide episode-level elective surgery waiting list information to the Department of Health and Human Services. ESIS includes demographic, waiting time and other characteristics of elective surgery for each episode of patient care. Clinical information such as diagnosis is not captured in ESIS, so linkage to the VAED is required to identify patients with cancer. Each episode-level record was treated as an independent observation, as multiple waiting list entries for a patient are acceptable if the procedures are independent of each other. Over the 10-year period, 80% of patients (n = 120,920) had one operation and 20% (n = 30,511) had multiple operations for cancer.

Inclusion and exclusion criteria

We selected all public hospital discharge episodes for persons aged 15 years or over, admitted for bowel, breast, lung or other cancer elective surgery between July 2006 and June 2016. Principal cancer diagnoses and procedures were identified using the diagnosis and procedure codes (Additional file 1: Appendix 1). Mapping of the diagnosis (2006 to 2016 edition) and procedure (5th to 9th edition) codes was conducted to ensure consistent coding over time. ‘Admission type’ and ‘Diagnosis Related Groups type’ were limited, respectively, to ‘admission from the waiting list’ and ‘surgical’. We excluded admissions classified as clinical urgency category 3 (i.e. admission within 365 days) due to the small number of cases. Waiting times for patients who had yet to receive their surgery at the end of the study period were not included.

Outcome

The outcome of interest was total waiting time for cancer elective surgery, defined as the number of days between being placed on a waiting list for surgery and the date of admission for that surgery. This does not include the waiting time between a GP’s referral and the initial specialist appointment.
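This outcome is a simple date difference. A minimal sketch of the calculation, assuming hypothetical field names (`listing_date`, `admission_date`) rather than the actual VAED/ESIS variable names:

```python
# Sketch: total waiting time as days between placement on the waiting list
# and admission for surgery. Field names are illustrative, not the actual
# dataset variables; the GP-referral-to-specialist wait is not captured.
from datetime import date

def waiting_time_days(listing_date: date, admission_date: date) -> int:
    """Days between waiting-list entry and admission for that surgery."""
    return (admission_date - listing_date).days

# Listed 1 March, admitted 25 March -> a 24-day wait
```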

Data linkage

Deterministic data linkage of VAED and ESIS records for the period 2006–07 to 2015–16 was undertaken by the Centre for Victorian Data Linkage (CVDL) [38]. Cases were extracted from the VAED using the inclusion criteria and then linked to ESIS using Medicare number and suffix, date of birth, sex, and hospital unit record number. The de-identified linked dataset was provided for analysis by CVDL.
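Deterministic linkage of this kind joins records only where all key fields match exactly. A minimal sketch under that assumption, with illustrative field names; the actual CVDL linkage process is more involved than this:

```python
# Sketch of deterministic (exact-match) record linkage between two
# episode-level extracts. Key and field names are hypothetical stand-ins
# for the Medicare number/suffix, date of birth, sex and hospital unit
# record number used in the study.

KEYS = ("medicare_no", "medicare_suffix", "dob", "sex", "hospital_urn")

def link_deterministic(vaed_records, esis_records):
    """Join VAED episodes to ESIS waiting-list entries on exact key agreement."""
    # Index ESIS entries by their key tuple for constant-time lookup.
    index = {}
    for rec in esis_records:
        index.setdefault(tuple(rec[k] for k in KEYS), []).append(rec)
    # Emit one linked record per matching VAED/ESIS pair.
    linked = []
    for rec in vaed_records:
        for match in index.get(tuple(rec[k] for k in KEYS), []):
            linked.append({**rec, **match})
    return linked
```

Episodes with no exact match on every key are dropped, which is the defining trade-off of deterministic (as opposed to probabilistic) linkage.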

Statistical analysis

We compared demographic characteristics of patients with breast, bowel, lung or other cancer before and after PPR. A total treatment group was created by combining patients with breast, bowel or lung cancer together. We conducted difference-in-differences (DID) analyses to compare the change in total waiting times after the introduction of PPR for each cancer type and the total treatment group. DID analysis is a commonly used empirical technique in quasi-experimental studies to evaluate the impact of a policy as it measures the change in an outcome before and after an intervention between treatment and control groups, then subtracts one from the other to see the ‘difference in the differences’ [39]. DID analysis is usually implemented as an interaction term between intervention and time in a regression model. An assumption of DID analysis is parallel trends, in which the pre-intervention trends in outcomes are the same between the treatment and control groups. Visual inspections of the pre-treatment trends for the treatment and control groups were conducted (Additional file 1: Appendix 2). Breast, bowel and lung cancer were considered the treatment group because their waiting time indicators were publicly reported on the MyHospitals website, and other cancers were the control group (their waiting time indicators were not publicly reported). Each model was adjusted for sex, age group, marital status, preferred language, patient region, patient type, urgency, surgical specialty, length of stay, and hospital peer group. A hospital peer group consists of similar hospitals based on shared characteristics such as hospital size, service provision and geographical location, thus enabling hospitals to be compared to other similar hospitals to ensure valid comparisons are made [40]. A p-value of less than 0.05 was considered statistically significant in all analyses. Data analyses were conducted using Stata version 14.

Ethical considerations

Ethical approval for the study was granted by the Melbourne School of Population and Global Health Human Ethics Advisory Group, The University of Melbourne.

Results

Patient characteristics

Descriptive statistics of patient characteristics by cancer type before and after PPR are presented in Table 1. Over the 10-year period there were 199,885 cancer elective surgeries. Of these, 4.6% (n = 9104) were for bowel, 8.5% (n = 16,926) for breast, and 1.5% (n = 3037) for lung cancers. Males accounted for the majority of bowel and lung cancer surgeries. Bowel and lung cancer surgeries were most common among those aged 70–74 years, while breast cancer surgeries most commonly occurred among those aged 50–54 years. The majority of patients attended a metropolitan hospital. Patients with bowel and breast cancers were commonly treated in large hospitals (public acute group A hospitals) and patients with lung cancer in major referral hospitals. The mean length of stay was shortest for breast cancer, followed by lung and bowel cancers.

Table 1 Demographic characteristics of patients by cancer types before and after public performance reporting (N = 199,885)

Waiting times

The mean and median cancer elective surgery waiting times by urgency category and cancer type are presented in Table 2. Following PPR, there was a slight increase in mean and median waiting times across all cancer types and urgency categories. The largest increase in waiting times occurred for lung cancer.

Table 2 Cancer elective surgery waiting times (in days) by urgency category and cancer type

Treatment within recommended time

The proportions of discharged patients treated within recommended elective surgery waiting times by urgency category and cancer type are presented in Table 3. Across all cancer types, patients classified as urgent were all treated within 30 days both before and after PPR. Similarly, the majority of patients classified as semi-urgent were treated within 90 days for all cancer types, although there was a small decrease in the proportion of patients with bowel or breast cancer treated within 90 days following PPR. This decrease was not statistically significant on chi-square testing.

Table 3 Patients treated within recommended elective surgery waiting times by urgency category and cancer type

Difference-in-differences analyses

Table 4 presents the results of the DID models comparing the change in waiting times to treatment for breast, bowel and lung cancers, and for the total treatment group, with other cancers, stratified by urgency category and adjusted for sex, age group, marital status, preferred language, patient region, patient type, surgical specialty, length of stay and hospital peer group. There was an effect of PPR on waiting times to treatment for breast and lung cancers for urgent cases but not for semi-urgent cases. Patients with breast cancer waited on average half a day less for treatment than patients with other cancer types following PPR. In contrast, patients with lung cancer waited on average two and a half days more for treatment than patients with other cancer types following PPR. There was no effect of PPR on waiting times to treatment for bowel cancer across urgency categories. For the total treatment group, there was an effect of PPR on waiting times to treatment for urgent cases but not for semi-urgent cases. Patients with breast, bowel or lung cancer waited on average a quarter of a day less than patients with other cancer types following PPR.

Table 4 Adjusted difference-in-difference model regression results for effect of public performance reporting on cancer elective surgery waiting times stratified by urgency category

Discussion

There are several possible explanations for the limited change in waiting times following PPR. First, lack of public knowledge that waiting times are publicly reported may have prevented the use of PPR for quality improvement. Previous studies have shown that patients and providers, including medical officers and GPs, are generally not aware of PPR data and are unclear about what it contains [41,42,43,44]. Second, the MyHospitals website does not provide “real-time” data, and the data are not reported by diagnosis. The most recent time periods published on the MyHospitals website were 2012–13 for cancer elective surgery waiting times and 2017–18 for elective surgery waiting times. Although the latest elective surgery waiting times were reported, the data are provided by urgency category, specialty of surgeon and intended procedure, but not by diagnosis. This means that neither patients nor providers were able to access timely information about cancer-specific waiting times on the performance reporting website. As a result, there were limited opportunities for patients or providers to change their behaviour in response to such data.

Third, patients with breast, bowel or lung cancer waited on average 13 days for urgent care and 40 days for semi-urgent care, well below the recommended 30 and 90 days, respectively. The availability of relatively timely care within the Victorian jurisdiction means that pressure for change following PPR may have been less than in jurisdictions with unacceptably long waiting times. Notably, there was a slight increase in waiting times for patients with lung cancer following PPR, which likely reflected the marked increase in the lung cancer incidence rate among females in Victoria [45] and Australia [46]. This is likely attributable to smoking patterns in past decades (the prevalence of smoking among females peaked in the mid-1970s). Other factors which may explain the continued increase in the lung cancer incidence rate include the changing composition of modern cigarettes and changes in the age structure and size of the population.

Fourth, the population of Victoria grew by more than 1.1 million people during the study period. It is possible that PPR led to improvements in efficiency within the healthcare system which were masked by the significant increase in demand associated with the rapidly growing population. As of 30th July 2020, there were over 56,000 patients on the waiting list across all urgency categories [47]. It was not possible to ascertain how many cases were cancer-related, as ESIS records the intended procedure and not the patient's diagnosis. Knowledge of diagnosis requires linkage to the VAED following an admission to hospital [36]. As such, the data included only patients with cancer who were admitted to hospital for surgery, and excluded those who may still have been on the waiting list at the end of the study period or who dropped off the waiting list during the study period (e.g. surgery was no longer required, or they died). Further research is warranted to investigate the demand for cancer elective surgery and the capacity of the healthcare system to deliver timely cancer services.

Despite achieving the elective surgery waiting time targets, it is unclear whether waiting times for cancer treatment are clinically appropriate, given that cancer surgeries are prioritised using the same waiting list system as other elective surgeries such as hip replacement. Setting a generic waiting time for all cancer surgeries has been criticised for not considering the complexity of the disease, the phases of cancer care and the different treatment modalities of the various cancer types [48]. Guidelines in the UK for acceptable waiting times focus on the following phases of cancer care: days between GP’s referral and first specialist appointment; days between GP’s referral and diagnosis; days between decision to treat and start of treatment; and days between GP’s referral and start of treatment [27]. In Australia, Cancer Council Australia has introduced optimal cancer care pathways for 15 cancer types, with variation in recommended waiting times across cancer care phases [49]. There is an opportunity to capture and report waiting time indicators by diagnosis across the entire patient journey by using linked primary care and hospital data. This would enable areas for improvement in care co-ordination, clinical practice and health services delivery to be identified at each stage of the cancer journey. Linkage of primary care and hospital data for cancer care is currently underway in Victoria, Australia, but collection of primary care data for linkage is not yet widespread [50,51,52].

Waiting time for elective surgery is one of the most commonly publicly reported generic health service ‘quality’ measures, alongside length of stay, complications and mortality. These measures are relatively simple to collect from administrative data but provide limited insight into the quality of healthcare delivery. De-identified linkage of cancer registries with administrative data (e.g. electronic medical records) could provide a more complete picture of the quality of cancer care and health system performance [53]. Using these data, Spinks et al. [48] proposed collecting and reporting the following meaningful quality indicators across the cancer care continuum: outcomes (e.g. recovery, functional restoration, and survival); structure (e.g. physical facilities, nurse-to-patient ratios); process (e.g. screening, prevention, diagnosis and staging); efficiency (e.g. adherence to guidelines); cost of care (e.g. direct and indirect costs); and patients’ perceptions of care (e.g. satisfaction).

Strengths and limitations

The study included state-wide population coverage of cancer elective surgery admissions over a period of 10 years. Despite the long time period and a suitable comparator group, the findings should be interpreted in the context of several study limitations. A prerequisite of DID analysis is finding a control group for which the parallel trends assumption is met, in which the pre-intervention trends in outcomes are the same between the treatment and control groups. Ideally, the only difference between the two groups would be exposure to the policy. In practice, such a group may be difficult to find. The characteristics of the control group differed slightly from those of each intervention group, but their pre-treatment outcomes followed a similar trajectory. As such, selection bias may exist, and the results should be interpreted with caution.

The Australian healthcare reforms included several hospital and related care policies, not limited to reducing waiting times for elective surgery and making service performance information publicly available [28]. As such, we were unable to disentangle the effect of confounding influences on cancer elective surgery waiting times, making it difficult to attribute the observed changes to a specific policy. Furthermore, it was unclear what quality improvement initiatives (if any) were implemented in hospitals following the introduction of PPR that would drive or impede improvement in cancer elective surgery waiting times. Further research is required to investigate the causal pathways through which PPR influences waiting times via quality improvement processes.

There has been some evidence of manipulation of the elective surgery waiting list in Victoria, Australia [54, 55]. Patients classified as urgent or semi-urgent whose waiting times for surgery were approaching the target for their category were reclassified as “not ready for care – patient initiated”. This ensured that category waiting time targets were not exceeded and that the hospital met elective surgery key performance indicators [56, 57]. It is unclear how prevalent data manipulation is and whether this influenced our results.

The total elective surgery waiting time does not include the wait to see a specialist following a GP’s referral, which underestimates the total waiting time across the cancer care continuum, masking the true demands on the public hospital sector and the impact on public patients. There is no national administrative system in Australia that captures data on how many patients are waiting for these appointments, nor how long they have waited, although individual hospitals may collect such information in their internal databases, as referrals and appointments are dated and documented. Further research is warranted to capture the complete picture of waiting times for cancer elective surgery so that policy makers can make fully informed decisions about public hospital service planning, delivery and resourcing.

Improved elective surgery waiting times would not necessarily represent an improvement in patient care; we did not have information on tumour stage or the clinical outcomes of patients. Future research is warranted to better understand the relationships between timeliness of care delivery and clinical outcomes. The study included public hospitals only, and generalisability to the private sector is unknown; however, private patients treated in private hospitals would be expected to have shorter waiting times [58,59,60]. During the study period, the number of Australians with private health insurance increased from 9 million (44% of the population) to 11 million (47% of the population) [61]. Access to the Medicare Benefits Schedule data [62] or the National Hospital Morbidity Database [63] could provide insights into the number of cancer patients who underwent surgical treatment in the private sector. The Medicare Benefits Schedule is a list of Medicare services subsidised by the Australian government, which includes private patients in public or private hospitals who made a Medicare claim. The National Hospital Morbidity Database includes episode-level records from admitted patient morbidity data collection systems in Australian public and private hospitals. Further research is warranted to explore differences in waiting times between public and private patients, and whether publicly reported waiting times in public hospitals influence cancer patients’ decision to seek treatment in the private sector.

Conclusions

Our findings suggest that the introduction of elective surgery waiting time targets and PPR, which enable greater transparency of cancer elective surgery waiting times, had limited impact on patient waiting times in Victoria, Australia. Nonetheless, publicly reporting waiting times may still support patients with cancer to make an informed choice about their surgery, giving them the opportunity to pursue other avenues for surgery if made aware of the ‘true’ waiting times. This will depend on having a ‘real-time’ dashboard or website, supplemented with relevant and meaningful quality indicators across the cancer care continuum for individual cancer types, to inform patients’ choice.

Availability of data and materials

The data that support the findings of this study are available from CVDL but restrictions apply to the availability of these data, which were used under license for the current study, and so are not publicly available. Data are however available from CVDL for researchers who meet the criteria for access to confidential data. Information related to the process for requesting data is contained on the CVDL website:

https://www2.health.vic.gov.au/about/reporting-planning-data/the-centre-for-victorian-data-linkage

Abbreviations

CVDL:

The Centre for Victorian Data Linkage

DID:

Difference-in-Differences

ED:

Emergency Department

ESIS:

Elective Surgery Information System

GP:

General Practitioner

PPR:

Public Performance Reporting

VAED:

Victorian Admitted Episodes Dataset

References

  1. Bilimoria KY, Ko CY, Tomlinson JS, Stewart AK, Talamonti MS, Hynes DL, et al. Wait times for cancer surgery in the United States: trends and predictors of delays. Ann Surg. 2011;253(4):779–85.

  2. Paul C, Carey M, Anderson A, Mackenzie L, Sanson-Fisher R, Courtney R, et al. Cancer patients' concerns regarding access to cancer care: perceived impact of waiting times along the diagnosis and treatment journey. Eur J Cancer Care. 2012;21(3):321–9.

  3. Karp P. Labor promises $500m to cut public hospital waiting times for cancer treatment. Australia: The Guardian; 2019. Available from: https://www.theguardian.com/australia-news/2019/apr/09/labor-promises-500m-to-cut-public-hospital-waiting-times-for-cancer-treatment

  4. Duncan P, Campbell D. Thousands of cancer patients face NHS treatment delays. Australia: The Guardian; 2018. Available from: https://www.theguardian.com/society/2018/aug/09/thousands-of-cancer-patients-face-nhs-treatment-delays

  5. Siciliani L, Moran V, Borowitz M. Measuring and comparing health care waiting times in OECD countries. Health Policy. 2014;118(3):292–303.

  6. Chimonas S, Fortier E, Li DG, Lipitz-Snyderman A. Facts and fears in public reporting: patients' information needs and priorities when selecting a hospital for cancer care. Med Decis Mak. 2019;39(6):632–41.

  7. Yang A, Chimonas S, Bach PB, Taylor DJ, Lipitz-Snyderman A. Critical choices: what information do patients want when selecting a hospital for cancer surgery? J Oncol Pract. 2018;14(8):e505–12.

  8. Rechel B, McKee M, Haas M, Marchildon GP, Bousquet F, Blümel M, et al. Public reporting on quality, waiting times and patient experience in 11 high-income countries. Health Policy. 2016;120(4):377–83.

  9. NHS England. MyNHS. UK: NHS; 2019. Available from: https://www.nhs.uk/service-search/Performance/Search

  10. NHS England. Reviews and ratings. UK: NHS; 2019. Available from: https://www.nhs.uk/Services/Trusts/ReviewsAndRatings/DefaultView.aspx?id=3446

  11. Berwick DM, James B, Coye MJ. Connections between quality measurement and improvement. Med Care. 2003;41(1):1–30.

  12. Duckett S. Getting an initial specialists' appointment is the hidden waitlist. Parkville VIC: The Conversation; 2018. Available from: https://theconversation.com/getting-an-initial-specialists-appointment-is-the-hidden-waitlist-99507

  13. Siciliani L, Borowitz M, Moran V. Waiting time policies in the health sector: what works? Paris: OECD Publishing; 2013. p. 328.

  14. Reddy S, Jones P, Shanthanna H, Damarell R, Wakerman J. A systematic review of the impact of healthcare reforms on access to emergency department and elective surgery services: 1994–2014. Int J Health Serv. 2018;48(1):81–105.

  15. Besley TJ, Bevan G, Burchardi K. Naming & Shaming: the impacts of different regimes on hospital waiting times in England and Wales; 2009.

  16. Propper C, Sutton M, Whitnall C, Windmeijer F. Did 'targets and terror' reduce waiting times in England for hospital care? BE J Econ Anal Policy. 2008;8(2):1–27.

  17. Willcox S, Seddon M, Dunn S, Edwards RT, Pearse J, Tu JV. Measuring and reducing waiting times: a cross-national comparison of strategies. Health Aff. 2007;26(4):1078–87.

  18. Dimakou S, Parkin D, Devlin N, Appleby J. Identifying the impact of government targets on waiting times in the NHS. Health Care Manag Sci. 2009;12(1):1–10.

  19. Kreindler SA. Policy strategies to reduce waits for elective care: a synthesis of international evidence. Br Med Bull. 2010;95(1):7–32.

  20. Anderson B. Cancer management: the difficulties of a target-driven healthcare system. Br J Nurs. 2016;25(9):S36–40.

  21. Bevan G, Hood C. What's measured is what matters: targets and gaming in the English public health care system. Public Adm. 2006;84(3):517–38.

  22. Viberg N, Forsberg BC, Borowitz M, Molin R. International comparisons of waiting times in health care – limitations and prospects. Health Policy. 2013;112(1):53–61.

  23. Löfvendahl S, Eckerlund I, Hansagi H, Malmqvist B, Resch S, Hanning M. Waiting for orthopaedic surgery: factors associated with waiting times and patients' opinion. Int J Qual Health Care. 2005;17(2):133–40.

  24. Levy AR, Sobolev BG, Hayden R, Kiely M, FitzGerald JM, Schechter MT. Time on wait lists for coronary bypass surgery in British Columbia, Canada, 1991–2000. BMC Health Serv Res. 2005;5(1):22–32.

  25. Bray F, Ferlay J, Soerjomataram I, Siegel RL, Torre LA, Jemal A. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin. 2018;68(6):394–424.

  26. Aberg L, Albrecht B, Rudolph T. How health systems can improve value in cancer care: McKinsey & Company; 2012. p. 14. Available from: https://www.mckinsey.com/industries/healthcare-systems-and-services/our-insights/howhealth-systems-can-improve-value-in-cancer-care

  27. Bate A, Baker C, Mackley A. NHS cancer targets. UK: House of Commons Library; 2018. p. 41.

  28. Council of Australian Governments. National Healthcare Agreement 2011. Canberra ACT: COAG; 2011. p. 19.

  29. Council of Australian Governments. The National Health Reform Agreement - National Partnership Agreement on improving public hospital services. Canberra ACT: COAG; 2011. p. 56.

  30. Council of Australian Governments. National health reform performance and accountability framework. Canberra ACT: COAG; 2011. p. 31.

  31. Australian Institute of Health and Welfare. MyHospitals. Canberra ACT: AIHW; 2020. Available from: http://www.myhospitals.gov.au/

  32. Australian Bureau of Statistics. Australian demographic statistics [internet]. Belconnen ACT: ABS; 2020. Available from: https://www.abs.gov.au/AUSSTATS/abs@.nsf/mf/3101.0

  33. Department of Human Services. About Medicare [internet]. Canberra ACT: Australian Government; 2019. Available from: https://www.humanservices.gov.au/individuals/subjects/whats-covered-medicare/about-medicare

  34. Hunter DJ. Desperately seeking solutions: rationing health care. NY USA: Routledge; 2018. p. 174.

  35. Australian Prudential Regulation Authority. Quarterly private health insurance statistics. Sydney: APRA; 2019. p. 15.

  36. Department of Health & Human Services. Victorian admitted episodes dataset. Melbourne: Victoria State Government; 2019. Available from: https://www2.health.vic.gov.au/hospitals-and-health-services/data-reporting/health-data-standards-systems/data-collections/vaed

  37. The Independent Hospital Pricing Authority. ICD-10-AM/ACHI/ACS current edition. Darlinghurst: IHPA; 2019. Available from: https://www.ihpa.gov.au/what-we-do/icd-10-am-achi-acs-current-edition

  38. Department of Health & Human Services. The Centre for Victorian Data Linkage. Melbourne: Victoria State Government; 2019. Available from: https://www2.health.vic.gov.au/about/reporting-planning-data/the-centre-for-victorian-data-linkage

  39. Dimick JB, Ryan AM. Methods for evaluating changes in health care policy: the difference-in-differences approach. JAMA. 2014;312(22):2401–2.

  40. Australian Institute of Health and Welfare. Australian hospital peer groups. Canberra ACT: AIHW; 2015. p. 187.

  41. Canaway R, Bismark M, Dunt D, Prang K-H, Kelaher M. "What is meant by public?": stakeholder views on strengthening impacts of public reporting of hospital performance data. Soc Sci Med. 2018;202:143–50.

  42. Prang K-H, Canaway R, Bismark M, Dunt D, Kelaher M. The use of public performance reporting by general practitioners: a study of perceptions and referral behaviours. BMC Fam Pract. 2018;19(1):1–11.

  43. Prang K-H, Canaway R, Bismark M, Dunt D, Miller JA, Kelaher M. Public performance reporting and hospital choice: a cross-sectional study of patients undergoing cancer surgery in the Australian private healthcare sector. BMJ Open. 2018;8(4):1–9.

  44. Canaway R, Bismark M, Dunt D, Kelaher M. Public reporting of hospital performance data: views of senior medical directors in Victoria, Australia. Aust Health Rev. 2018;42(5):591–9.

  45. Thursfield V, Farrugia H. Cancer in Victoria: statistics & trends 2016. Melbourne: Cancer Council Victoria; 2017. p. 64.

  46. Australian Institute of Health and Welfare. Cancer data in Australia. Canberra: AIHW; 2020. Available from: https://www.aihw.gov.au/reports/cancer/cancer-data-in-australia

  47. Victorian Agency for Health Information. Patients waiting for treatment. Melbourne: Victoria State Government; 2020. Available from: https://vahi.vic.gov.au/elective-surgery/patients-waiting-treatment

  48. Spinks TE, Walters R, Feeley TW, Albright HW, Jordan VS, Bingham J, et al. Improving cancer care through public reporting of meaningful quality measures. Health Aff. 2011;30(4):664–72.

  49. Cancer Council. Optimal cancer care pathways. Australia: Cancer Council; 2019. Available from: https://www.cancer.org.au/health-professionals/optimal-cancer-care-pathways.html

  50. Emery J, Boyle D. Data linkage. Aust Fam Physician. 2017;46(8):615–9.

  51. Emery J. Studying the continuum of cancer care through linking primary data [internet]. Melbourne: The University of Melbourne; 2019. Available from: https://medicine.unimelb.edu.au/research-groups/general-practice-research/cancer-research-group/linkage-of-hospital-and-primary-care-data-to-drive-improvements-in-cancer-care

  52. Canaway R, Boyle DI, Manski-Nankervis JAE, Bell J, Hocking JS, Clarke K, et al. Gathering data for decisions: best practice use of primary care electronic records for research. Med J Aust. 2019;210:S12–6.

  53. Hiatt RA, Tai CG, Blayney DW, Deapen D, Hogarth M, Kizer KW, et al. Leveraging state cancer registries to measure and improve the quality of cancer care: a potential strategy for California and beyond. J Natl Cancer Inst. 2015;107(5):1–7.

  54. SBS News. Vic waiting list omissions to be audited [internet]. Australia: SBS News; 2016. Available from: https://www.sbs.com.au/news/vic-waiting-list-omissions-to-be-audited

  55. Tomazin F, Mills T. From gag orders to dubious data: how your hospital keeps you in the dark [internet]. Australia: The Age; 2018. Available from: https://www.theage.com.au/national/victoria/from-gag-orders-to-dubious-data-how-your-hospital-keeps-you-in-the-dark-20180828-p5007a.html

  56. Curtis AJ, Stoelwinder JU, McNeil JJ. Management of waiting lists needs sound data. Med J Aust. 2009;191(8):423–4.

  57. Nocera A. Performance-based hospital funding: a reform tool or an incentive for fraud? Med J Aust. 2010;192(4):222–4.

  58. Whyte R, Connolly S, Wren MA. Insurance status and waiting times for hospital-based services in Ireland. Health Policy. 2020;124(11):1174–81.

  59. Australian Institute of Health and Welfare. Private health insurance use in Australian hospitals, 2006–07 to 2015–16: Australian hospital statistics. Canberra: AIHW; 2017. p. 175.

  60. Shmueli A, Savage E. Private and public patients in public hospitals in Australia. Health Policy. 2014;115(2–3):189–95.

  61. Australian Institute of Health and Welfare. Private health insurance [internet]. Canberra: AIHW; 2021. Available from: https://www.aihw.gov.au/reports/australias-health/private-health-insurance

  62. Australian Institute of Health and Welfare. Medicare benefits schedule (MBS) data collection [internet]. Canberra: AIHW; 2021. Available from: https://www.aihw.gov.au/about-our-data/our-data-collections/medicare-benefits-schedule-mbs

  63. Australian Institute of Health and Welfare. National hospitals data collection. Canberra: AIHW; 2021. Available from: https://www.aihw.gov.au/about-our-data/our-data-collections/national-hospitals-data-collection

Acknowledgements

We wish to thank the Centre for Victorian Data Linkage (CVDL) for providing access to the linked VAED and ESIS data.

Funding

The research was funded by the Medibank Better Health Foundation. Views expressed are those of the authors and not the funding agency. The funding agency had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Author information

Contributions

MK, DD and MB conceptualised and designed the study and obtained its funding. KP performed the statistical analyses, interpretation of the data and drafted the manuscript. RC, MB, DD, JM and MK contributed to data interpretation and critically reviewed the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Khic-Houy Prang.

Ethics declarations

Ethics approval and consent to participate

Ethical approval for the study was obtained from the Melbourne School of Population and Global Health Human Ethics Advisory Group, the University of Melbourne. CVDL granted access to the de-identified linked dataset for this study. CVDL's operating model is governed by a number of Acts and Regulations, including the Victorian Health Records Act 2001 and the Privacy and Data Protection Act 2014, which allow approved researchers to securely access de-identified data for research purposes without seeking individuals' consent.

Consent for publication

Not applicable.

Competing interests

Professor Margaret Kelaher is a member of the BMC Health Services Research Editorial Board.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Appendix 1. Cancer diagnosis and procedure codes. Appendix 2. Visual inspection of parallel trends in the pre-intervention period.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Prang, KH., Canaway, R., Bismark, M. et al. The impact of public performance reporting on cancer elective surgery waiting times: a data linkage study. BMC Health Serv Res 21, 129 (2021). https://doi.org/10.1186/s12913-021-06132-w

Keywords

  • Public reporting
  • Waiting times
  • Cancer
  • Surgery
  • Data linkage