  • Research article
  • Open access

Development, implementation and evaluation of an evidence-based program for introduction of new health technologies and clinical practices in a local healthcare setting



This paper reports the process of establishing a transparent, accountable, evidence-based program for introduction of new technologies and clinical practices (TCPs) in a large Australian healthcare network. Many countries have robust evidence-based processes for assessment of new TCPs at national level. However, many decisions are made by local health services, where the resources and expertise to undertake health technology assessment (HTA) are limited and a lack of structure, process and transparency has been reported.


An evidence-based model for process change was used to establish the program. Evidence from research and local data, experience of health service staff and consumer perspectives were incorporated at each of four steps: identifying the need for change, developing a proposal, implementation and evaluation. Checklists assessing characteristics of success, factors for sustainability and barriers and enablers were applied and implementation strategies were based on these findings. Quantitative and qualitative methods were used for process and outcome evaluation. An action research approach underpinned ongoing refinement to systems, processes and resources.


A Best Practice Guide developed from the literature and stakeholder consultation identified seven program components: Governance, Decision-Making, Application Process, Monitoring and Reporting, Resources, Administration, and Evaluation and Quality Improvement. The aims of transparency and accountability were achieved. The processes are explicit, decisions published, outcomes recorded and activities reported. The aim of ascertaining rigorous evidence-based information for decision-making was not achieved in all cases. Applicants proposing new TCPs provided the evidence from research literature and local data; however, the information was often incorrect or inadequate, overestimating benefits and underestimating costs. Due to these limitations the initial application process was replaced by an Expression of Interest from applicants followed by a rigorous HTA by independent in-house experts.


The program is generalisable to most health care organisations. With one exception, the components would be achievable with minimal additional resources; the lack of skills and resources required for HTA will limit effective application in many settings. A toolkit containing details of the processes and sample materials is provided to facilitate replication or local adaptation by those wishing to establish a similar program.



New health technologies and clinical practices (TCPs) are defined as therapeutic interventions or diagnostic procedures that are considered by a reasonable body of clinical opinion to be significantly different from existing clinical practice. Therapeutic interventions include prostheses, implantable devices, vaccines, pharmaceuticals and medical, surgical or other clinical procedures [1].

Australia has robust evidence-based processes for assessment of new health technologies, clinical practices and medications through the national Medical Services Advisory Committee (MSAC) and Pharmaceutical Benefits Advisory Committee (PBAC). Although these processes are rigorous and provide trustworthy information, they do not address all the requirements of healthcare decision-makers. MSAC decisions only cover therapeutic and diagnostic procedures provided by doctors; they do not include activities of nursing and allied health professionals, models of care or service delivery. PBAC decisions only consider pharmaceuticals for community use and do not include some therapeutic agents used solely in the hospital context. Not all topics being considered by decision-makers have been addressed in national recommendations, and central agencies cannot take into account local factors such as population needs, organisational priorities, budgets, capacity or capability. Hence many decisions about the use of TCPs have to be made at the state, regional and hospital levels. At national level, evidence-based assessment and development of recommendations for application and funding of new TCPs is enabled by rigorous processes, underpinned by appropriate resources and expertise. However, at the local level, limitations in processes, resources and expertise mean that decision-making is undertaken with varying degrees of rigour, structure and transparency [2–4].

Monash Health (previously Southern Health) is a large health service network providing primary and secondary care in the south east of Melbourne and tertiary and quaternary care in specialist areas across the state of Victoria, Australia. In 2000, Monash Health established the first Technology/Clinical Practice Committee (TCPC) in Victoria to assess TCPs prior to their introduction. Since then, a number of factors have influenced health agencies around the world in how they approach assessment of new TCPs. These global challenges arise from rapid advances in health technologies; consumers’ desires to be well informed and participate in decision-making; imperatives for transparent, accountable and evidence-based decision-making (EBDM); and the need to get best value from finite or decreasing resources.

Although an early leader in this area, the TCPC acknowledged that there were opportunities for improvement. Limitations of the Monash Health system included inadequate transparency and lack of explicit criteria in decision-making; lack of high-quality information for decision-making; meetings called at short notice, resulting in unrepresentative views and inadequate preparation time; lack of awareness of requirements for applications; poor accessibility of application materials; and limited resources to monitor newly introduced TCPs. The need for more rigorous processes to ensure safe introduction of new TCPs was identified as a priority for the organisation.

This project was undertaken by the Centre for Clinical Effectiveness (CCE), an in-house ‘Evidence Based Practice Hospital Support Unit’ providing expertise in evidence synthesis, implementation and evaluation [5].


The aim of the project was to establish a sustainable, transparent, accountable and evidence-based program for introduction of new TCPs in the local healthcare setting.

This paper aims to outline the development, implementation and evaluation of the program.

A toolkit for introduction of new TCPs in hospitals and health care organisations has also been developed which aims to assist health service staff to establish similar programs by providing detailed descriptions of the components, templates of useful documents and links to resources [Additional file 1].

Research questions

What is best practice for introduction of new TCPs?

How can best practice be implemented most effectively?

What are the outcomes of implementation and the factors for success?



This project was undertaken using the SEAchange model for sustainable, effective and appropriate change in health services [6]. The model involves four key steps: identifying the need for change, developing a proposal to meet the need, implementing the proposal and evaluating the extent and impact of the change. Each step is underpinned by principles of EBDM to ensure that the best available evidence from research and local data, the experience and expertise of health service staff and the values and perspectives of consumers are taken into account. Factors related to sustainability, avoidance of duplication and integration with existing systems are explicitly considered. Adaptation of the model for this project is outlined in Fig. 1.

Fig. 1 Four step model for evidence-based process change

Mixed methods were used for process and outcome evaluation. Quantitative methods included audit, surveys and document analysis. Qualitative methods included workshops, individual and group discussions and feedback forms.

An action research approach was adopted based on the ‘researcher as facilitator for change’ model defined by Meyer; researchers working explicitly with and for people rather than undertaking research on them [7, 8]. In this capacity, CCE staff took on the roles of TCPC Executive Officer and Administrative Officer during the development and revision phases of the project. Observations and reflections of the project team and committee members were used for ongoing improvements to the program components and implementation process. Consideration of ‘what worked, what didn’t, why and how it could be improved’ was used throughout.

A timeline for the project is included in the toolkit [Additional file 1].


Three stakeholder groups participated in the design, implementation and evaluation of the TCP program.

  1. ‘Decision-makers’ were members of the TCPC including an Executive Sponsor; representatives with expertise in operations, finance, evidence-based practice, ethical and legal considerations; clinical program directors and health service consumers. Further details are available in the Terms of Reference [Additional file 1: Appendix 7].

  2. ‘Administrators’ were health service staff managing the processes related to making, implementing, monitoring and reporting decisions related to introduction of new TCPs and specialist staff who provided expertise to assist applicants in use of evidence (CCE) and health service utilisation data (Clinical Information Management), coding (Health Information Services), credentialing and scope of practice (Medical Support Unit) and development of business cases (Finance and Business Managers).

  3. ‘Applicants’ were clinicians (medical, nursing or allied health) or clinical managers who were seeking authorisation to introduce a new TCP.

Data collection

Data were collected in an ongoing process over two years and methods were designed to minimise the time and effort required of participants. Scheduled meetings of the TCPC were used for formal workshops and informal group discussions with the decision-makers. Informal interviews with administrators were undertaken during routine meetings or by appointment. The applicants were clinicians based across a number of campuses who found it difficult to attend additional meetings, hence a range of options for individual feedback was provided (details are noted below and examples are provided in Additional file 1).

Discussion papers, background documents and formal presentations were prepared for workshops. All group and individual meetings had an agenda which included the topics for discussion and decisions required.


Proposals for program design, implementation strategies and evaluation plans were drafted by the project team based on findings from the literature and local research. These were refined based on stakeholder feedback. Decisions were made by the TCPC; discussion was informal and decisions were based on consensus.

Step 1: Identify the need for change

Needs assessment

The views of decision-makers and administrators were sought in group and individual discussions.

To capture feedback from previous and potential applicants, recent users of the existing system were contacted personally and a generic invitation to provide feedback was circulated via the ‘Senior Medical Staff’, ‘All Managers’ and ‘Department Head’ email lists. Email, phone and face-to-face responses were accepted.

Although the four stages in this model are sequential (Fig. 1), change processes are not always linear and often require iterative changes to decisions made in earlier steps. Additional needs were elicited during the implementation and evaluation steps using action research methods, including feedback sheets on pilot documents, applicants’ responses to invitations to provide feedback, and reflections and observations of the committee and project team.

Literature review

A review of international practice to identify key principles for a TCP Program was undertaken. It was anticipated that guidance on development of a systematic approach to governance and decision-making in a health service would be found in policy documents, reports, government publications and research studies. All publication types were eligible and would be included if they addressed methods, processes, recommendations or guidance for introduction of new TCPs.

Finding appropriate search terms for use in health databases was problematic. Broad searches returned too many titles to process. Narrower searches failed to find any relevant literature and it was unclear whether this was due to lack of available information or limitations of the search terms. An internet search was conducted using the search string (new technology clinical practice) AND (committee OR guide OR policy OR procedure) in the Google Advanced Search function.

Critical appraisal relevant to the study design was planned, however no research studies were identified and the expert guidance documents ascertained contained no methods to allow critical appraisal.

Step 2: Develop a proposal for change

Best practice guide

The principles identified in the literature and local needs assessment were collated and tabulated into a Best Practice Guide. Program components were developed through stakeholder consultation and feedback.

Likelihood of success and sustainability

A checklist developed for previous CCE projects was used to assess the likelihood of success and sustainability of the proposed changes. The characteristics of success were derived from the work of Grol and Grimshaw; Grol, Wensing and Eccles; and Greenhalgh et al. [9–11], and the sustainability factors were adapted from a capacity-building framework [12].

Step 3: Implement the change

Barriers and enablers

Barriers and enablers to the proposed changes were identified by decision-makers and administrators in group and individual discussions, and by applicants in individual discussions, feedback forms at the end of all documents and email invitations to provide input. The project team used the checklist for success and sustainability and the classification of barriers and enablers by Grol and Wensing [13] as prompts to identify additional factors. Barriers and enablers in the context of organisational decision-making were also sought from the literature.

Step 4: Evaluate the extent and results of the change


A formal evaluation plan was developed. Full details including the evaluation questions for each component, indicators, methods, sources and timing of data collection, and reporting schedule are available in the toolkit [Additional file 1].

Current practice was mapped against the identified principles in the Best Practice Guide to provide baseline data. Planned evaluations were undertaken at 12 and 24 months [Additional file 1].

Ongoing quality improvement

Following the initial evaluation period, the TCPC and Secretariat continued to collect, analyse and act on feedback as a quality improvement activity.


The Monash Health Human Research and Ethics Committee approved this project as a Quality Assurance activity (Application Number 09195Q).


Step 1: Identify the need for change

Needs assessment

Twenty-five needs were identified (Table 1). These confirmed the limitations of the existing system and provided opportunities and methods for improvement. Decision-makers noted issues related to meetings held at short notice affecting their ability to attend and adequately appraise materials provided, lack of resources to administer the process and insufficient information on which to base decisions. Applicants reported difficulty accessing information about the process and frustration at being asked to submit applications to multiple committees. Inadequate governance and reporting structures and a need for mechanisms to deal with change of use of TCPs were also noted.

Table 1 Needs assessment

Literature review

Five relevant publications from national and state bodies were identified [1, 14–17]. The four government agencies and a professional association with expertise in HTA were assessed as appropriate sources for this type of information and the documents were considered to be expert guidance.

The publications identified standards, rules, criteria or principles that they recommended for TCP programs within hospitals or health care organisations. For the purposes of this paper, these are referred to collectively as ‘principles’ for good practice in introduction of TCPs.

Twenty-seven principles were extracted. There was considerable variation in content between the documents with only six principles common to all five publications (Additional file 2).

Need for change

The 25 local needs were reframed as principles; four of these had also been identified among the 27 from the literature [1, 14–17], giving a total of 48 principles after removal of duplicates.
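The deduplication arithmetic above can be checked with a short sketch; the counts come from the text, while the variable names are illustrative only:

```python
# Combining principles from the local needs assessment and the literature.
# Principles identified in both sources are counted only once.
local_needs = 25   # principles reframed from the local needs assessment
literature = 27    # principles extracted from the five publications
overlap = 4        # local principles also identified in the literature

total_principles = local_needs + literature - overlap
print(total_principles)  # 48
```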

There was a discrepancy in findings between the two sources. Monash Health staff identified five principles related to the need for adequate resources to deliver the program and support applicants in finding and using evidence from research and local data, preparing patient information and collecting and reporting outcomes. These were not identified in the literature.

Monash Health met only 14 out of 48 principles, establishing a clear need for improvement. The 34 unmet principles for good practice indicated the areas to be addressed (Additional file 2).

Step 2: Develop a proposal for change

Best practice guide

‘Best practice’ was defined as implementation of all the principles. Monash Health sought to establish the new program based on this guide to best practice (Additional file 2).

Program components

Principles in the Best Practice Guide were discussed with the stakeholders and drafted into the seven components that would form the new TCP Program: Governance, Decision-Making, Application Process, Monitoring and Reporting, Resources, Administration, and Evaluation and Quality Improvement. The aim of transparent, accountable and evidence-based decision-making was made explicit in each of the components. Details of how the principles within each component were operationalised are outlined in the toolkit and copies of all documents and resources are provided [Additional file 1].

Program logic

A detailed program logic model was constructed incorporating the key factors that required improvement, deliverables identified from the Best Practice Guide, intended outcomes and indicators. A summarised version is presented in Fig. 2.

Fig. 2 Program logic model for Technology/Clinical Practice Program

Likelihood of success and sustainability

The proposed program components and logic model were found to meet all the requirements for sustainability; however, assessment of likelihood of success identified several potential barriers and enablers (Table 2).

Table 2 Assessment of success and sustainability

Step 3: Implement the change

Barriers and enablers

Due to the iterative nature of the change process, barriers and enablers were identified prior to, during and subsequent to the implementation phase through the action research reflection and the evaluation activities.

Some factors were not explicitly recorded during the project but were acknowledged implicitly when strategies were developed and implemented to address them. They have been included for completeness and to assist others in replicating this program. Forty-five barriers and ten enablers are reported (Additional file 3).

Barriers were identified in each category and were applicable to decision-makers, administrators and applicants. Key themes are summarised below.

  • Economic and political context (n = 3): effect of state and national activities related to TCPs

  • Organisational context (n = 10): lack of time and resources, lack of awareness of current decision-making structures

  • Social context (n = 5): lack of influence, effect of perceptions and power relationships

  • Patient (n = 4): need for adequate consumer representation in decision-making, limitations in quality of patient information brochures

  • Individual professional (n = 17): lack of awareness, lack of knowledge and skills, poor compliance

  • Innovation (n = 6): complexity, time requirement, perceived lack of advantage

Enablers were related to the organisational commitment to the new program. The project was a high priority and the pursuit of excellence was made explicit. Funding and resources were provided and the Board, Executive, Senior Managers and Clinical Directors were all supportive.

Implementation strategies

Implementation strategies were developed to overcome or minimise barriers and build on enablers. Individual strategies are detailed against their corresponding barrier or enabler in Additional file 3. As additional barriers were identified during the course of the project, strategies were developed and implemented to deal with them.

The strategies fall into four main groups: changes to the structure of the TCP program, changes to the processes within it, provision of resources and support, and activities to communicate and disseminate information.

The first three groups could also be summarised as ‘make it mandatory, transparent and explicit’, ‘make it as easy as possible to do the right thing and hard to do the wrong thing’ and ‘provide as much help as the organisation can sustain’. These points applied equally to activities of decision-makers, administrators and applicants.

Changes to the structure and processes were implemented, resources were developed and the communication and dissemination strategies carried out. These have all been integrated into the toolkit so that it reflects best practice not only from the literature but also from the extensive learning from this project [Additional file 1].


The components were piloted during the Department of Health annual funding round for high cost TCPs. Input from decision-makers, administrators and applicants was obtained through invitations to provide phone or email feedback, feedback forms appended to all documents and a meeting held specifically for this purpose. Revisions to the documents and processes were made based on this feedback.

Program introduction

The program was implemented in full.

In addition to the specific pilot outlined above, all documents were subsequently implemented in ‘pilot’ mode with a feedback section for notes at the end of the documents and an invitation to contact the Secretariat personally with any additional feedback. Further refinements were based on this input and stakeholders were informed. This communicated to applicants that the processes were not rigid, that their feedback was welcome, that it would be acted upon, and that it resulted in improvements.

Step 4: Evaluate the extent and results of the change


Detailed evaluation reports at 12 and 24 months were published on the TCPC website [Additional file 1].

At the commencement of the project, Monash Health met only 14 of the 48 principles. When the mapping was repeated at the end of the evaluation period, all 48 principles had been met (Additional file 2).

When opportunities for improvement were identified by the formal evaluation activities or through the ongoing feedback and action research processes, modifications were implemented to address them. As a result, the program not only achieved all the baseline principles but identified and implemented 51 additional principles across all seven components that were not in the original Best Practice Guide (Additional file 2).

In addition to the formal evaluation questions, other unexpected outcomes indicated the success of the program.

  • Recommendations from the Department of Health to other Victorian health services to use Monash Health methods and resources

  • Requests from several Victorian and interstate health services for permission to use Monash Health documents

  • Request from another state government to provide training in Monash Health methods for their state-wide decision-making body

  • Attainment of a national award: Australian Council of Healthcare Standards National Quality Improvement Award for Non-Clinical Service Delivery

  • Nomination for a state award: Victorian Public Healthcare Award for Doing It Better: providing sustainable, well managed and efficient health services

  • One specific element of the new program, the Joint Committee process, had a successful outcome that enabled an international breakthrough (Table 3: Case study) [18, 19]

    Table 3 Case study: the Joint Committee process

The overall conclusion was that Monash Health had met its objectives of achieving a transparent, accountable and evidence-based program for introduction of new TCPs, and that the program was consistent with world best practice.

Ongoing quality improvement

Although the formal evaluations found that all elements of the program had been successfully implemented, as the outcomes and other implications of newly introduced TCPs were observed over time some shortcomings in the program became apparent (Table 4).

Table 4 Opportunities for improvement identified in evaluation (Needs assessment for program revision)

Some of the opportunities for improvement were minor, such as changes to the membership of the TCPC. However, after the first 14 application cycles it was clear that some of the original processes were not sufficient to achieve the desired level of rigour in decision-making. Two main issues were identified.

Firstly, the process was ‘applicant-driven’. Applicants were required to provide systematic reviews: as a first step they were to search for existing reviews and, if none were available, to conduct their own. Lack of knowledge and skills in evidence synthesis had been identified as barriers, and strategies to address these included a step-by-step guide to finding, appraising and synthesising research literature [20], templates in the application form to summarise the evidence appropriately, and provision of advice from an expert systematic reviewer. A similar approach was advocated for collection and synthesis of local data, with prompts for what was required and support from experts in coding, data analysis and finance. However, applicants did not always follow the instructions in the online guide, did not use the evidence summary tables correctly (in some cases not at all), and many did not consult the expert staff as instructed, or consulted them but reported the information provided selectively. Feedback from applicants themselves and observations of the TCPC and project team confirmed that the resources did not overcome the barriers. An additional challenge is the subjectivity inherent in an ‘applicant-driven’ system, where the evidence to inform the decision is provided by those proposing the change.

Secondly, the application form, based on the Department of Health requirements, did not compare the new TCP with current practice on a ‘head-to-head’ basis. The costs and service utilisation data were gathered and reported in different ways, which precluded definitive conclusions based on direct comparisons.

As a result of these two factors, the information provided often had omissions or errors and a tendency to overestimate positive outcomes and underestimate costs.

Program revision

All stakeholder groups were in agreement that there were problems in providing accurate information for decision-making. The TCPC became aware of models of health service policy committees in Canada and New Zealand that did not rely on applicants to provide information but utilised independent experts within the organisation to investigate the evidence from research and local data and develop a business case for new TCPs. These models demonstrated improved decision-making [Personal communication: Caroline McAleese, Auckland District Health Board] and resulted in considerable cost saving to the organisation [21]. Based on these findings, the TCPC decided to revise the Monash Health program.

Step 1 (revision): Identify the need for change

Needs assessment

The opportunities for improvement identified in Step 4 (Table 4) became the needs assessment for Step 1 of the revision process (Fig. 1). Ten new needs were reported. These reflected the ongoing problems with inadequate and inaccurate information to underpin decisions and the resources required to enable this.

Literature review

The previous review process was repeated. The search was augmented with review of reference lists of included publications and website searches of relevant agencies known to the project team.

Sixteen relevant publications from government agencies, professional bodies and health services in Australia, New Zealand, the United Kingdom and Canada were identified [1, 14–17, 22–32]. These were considered to be expert guidance from appropriately qualified organisations; no research studies were found.

One hundred and nine principles for a TCP Program were extracted. The additional 82 principles reflect the increased number of publications available and the greater level of detail they recommend.

All 16 publications recommended that a TCPC be established; that evidence of safety and effectiveness be robust and reliable; and that appropriate clinical and physical infrastructure and credentialed, trained staff be in place to support the introduction of new TCPs.

Twenty-six principles were cited by at least two thirds (10 or more) of the publications. These focused on overarching issues relating to governance, use of evidence in decision-making, and application and monitoring processes.

Fifty-eight principles were cited by fewer than one third (five or fewer) of the guidance documents. These fell into two groups: those that specify more detail in the governance, decision-making and application processes, and those that address aspects of reporting, administration, provision of resources and support, and evaluation of the program. Only five publications referred to consumer representation and only one suggested reporting outcomes to local consumer health councils or networks [14]. Only one source recommended repeat assessment of a newly introduced TCP at the end of a predefined period to determine whether it could be considered standard practice and monitoring could be ceased [31].

Need for change

The 10 local needs were reframed as principles, seven of which duplicated principles already in the literature; adding the remaining three to the 109 published principles brought the total to 112.

The initial literature review did not identify any recommendations for provision of resources, although Monash Health staff had considered this important. The principles related to resource provision from the first needs analysis were now included in the literature; however, none of the documents noted the principles identified by Monash Health staff for adequate allocation of staff with appropriate skills to manage and support the decision-making process, or the need for evaluation and improvement of the systems and processes for introducing new TCPs.

Monash Health met 89 out of 112 principles. Most of the unmet principles related to the need for objective assessments undertaken by independent experts and direct ‘head to head’ comparisons of the TCPs under consideration.

Step 2 (revision): Develop a proposal for change

Best practice guide

The Best Practice Guide was revised to include the additional principles and categorise them into the seven program components. In addition to those above, Monash Health staff had identified and implemented many other principles during the implementation phase, 11 of which were not found in the literature. The recommendation for an Expression of Interest to replace the current application process was not included as it was considered to be a local solution rather than a universal principle for best practice. The final total was 122 principles for good practice (Additional file 2).

Amendments to program components

The unmet principles were considered and amendments to the program were proposed. Membership of the TCPC was changed to increase the seniority, credibility and influence of the committee so that it could make and implement appropriate and acceptable decisions. The other amendments focused on improving the quality of information provided to decision-makers, predominantly through the application process.

Revised application process

In the new model the previous lack of objectivity, time and skills is addressed by providing resources so that independent experts can undertake the work. To minimise unnecessary resource use, and in contrast to the Canadian and New Zealand models, the information is requested in stages in the Monash Health program, each stage predicated on a positive decision at the stage before (Fig. 3).

Fig. 3
figure 3

Revised application process for introduction of new TCP

Applicants submit an Expression of Interest, a much briefer document than the previous application form, which greatly reduces their time commitment [Additional file 1]. The TCPC assesses whether the potential benefits of the new TCP, and its fit within the organisation’s goals and priorities, are enough to warrant using additional resources to explore it further. If so, the TCPC commissions a Systematic Review of the research evidence by the Centre for Clinical Effectiveness. If there is sufficient evidence of increased safety, effectiveness and/or cost-effectiveness to proceed, the TCPC commissions a Business Case. This assesses organisational capacity, capability, costs and resource implications, is undertaken by staff with expertise in these areas, and addresses the inadequacies of the previous application form by providing direct comparisons of costs and health service utilisation.
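The staged gating logic of the revised process can be sketched as follows. This is a minimal illustration of the decision flow in Fig. 3, not an implementation from the Monash Health program; the stage names follow the text, and the function names are ours:

```python
def assess_new_tcp(stages):
    """Run named decision stages in order; stop at the first negative decision.

    `stages` is a list of (name, decision) pairs, where each `decision` is a
    callable returning True (proceed) or False (stop). Later stages run only
    if every earlier stage returned a positive decision.
    """
    for name, decision in stages:
        if not decision():
            return f"rejected at {name}"
    return "approved for introduction"

# Example: a proposal that clears the first two stages but fails the Business Case.
stages = [
    ("Expression of Interest", lambda: True),   # TCPC: potential benefit and organisational fit
    ("Systematic Review", lambda: True),        # CCE: safety, effectiveness, cost-effectiveness
    ("Business Case", lambda: False),           # capacity, capability, costs, resources
]
print(assess_new_tcp(stages))  # rejected at Business Case
```

Each stage is only evaluated after a positive decision at the stage before, which is how the model minimises unnecessary resource use.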

Considering the problems inherent in an ‘applicant-driven’ model and the successes of the international models using independent experts, Monash Health anticipates that use of organisational resources to provide better information to underpin decisions will be cost-effective.

This process is currently being piloted and refined.



A systematic approach was used to ascertain evidence to underpin the new TCP program; however, no research was identified on organisational decision-making and clinical governance. Although the documents available were from credible organisations and considered to be ‘expert advice’, no quality appraisal could be undertaken to validate the recommendations. This is unlikely to have a significant impact on the process or outcomes of this initiative, as the recommendations reflect good practice principles that are well established elsewhere, e.g. transparency, evidence-based decision-making and consumer participation.

What worked?

The aims of transparency and accountability in decision-making were achieved. The process and requirements are explicit, decisions are published on the internet, outcomes are recorded and activities are reported internally and externally.

Almost all components of the TCP program were implemented effectively. A range of factors are likely to be responsible for this success.

  • Use of an evidence-based approach to change guided by information from the literature, local experts and consumers

  • Multidisciplinary stakeholder involvement, including consumers, in development, implementation and evaluation of the project and representation in the systems and processes of the ongoing program

  • Assessment of barriers and enablers, characteristics of success and factors for sustainability followed by tailoring of strategies to maximise the benefits and minimise the problems identified

  • Implementation in a long-term ‘piloting’ mode that captured and acted upon user feedback for continuous improvement

  • Credibility of a program underpinned by international best practice

  • Provision of sufficient resources to undertake the project and deliver the program

  • Commitment, support and leadership from the Board, Executive, Senior Management and Clinical Directors

  • Skills of the CCE team in Evidence-Based Practice, knowledge brokerage and implementation of change

The aim of sustainability was also met, at least for the foreseeable future while Monash Health ensures the relevant skills and resources are available to provide high-quality information for decision-making.

It is not uncommon in health services for new initiatives to fail if contextual factors are changed. At the end of the establishment phase the CCE project team handed over to the Medical Governance Office, as this was thought to be more suitable and sustainable for ongoing administration of the committee. A second handover to the Director of Medical Services was undertaken, and then CCE took over the role again at the beginning of the revision phase. All three handovers went smoothly. The systems, processes, documents and resources proved to be readily transferable and the program ran seamlessly throughout the moves between departments.

Monash Health demonstrated effective leadership in this area. The high standards achieved by this initiative were acknowledged through a national award and multiple requests to assist decision-makers in other contexts with translation to their settings. Many health services have yet to address the issues related to introduction of new TCPs and it is still common for others to rely on processes that are not evidence-based, transparent or accountable [33].

What didn’t work?

Minor problems were amended as they arose; these are captured in the needs assessments and barrier analyses and the strategies to address them are summarised in the toolkit. The main areas of concern related to non-compliance with the application requirements which resulted in incorrect or inadequate information being provided for decision-making. This is being addressed in the revision phase.

Implications for policy and practice

The Australian government has called for reform of post-market surveillance of health technologies to strengthen patient safety and value for money for taxpayers and an international policy forum has proposed that a minimum dataset be developed to focus monitoring activities [34, 35]. This project has demonstrated that it is possible to assess newly introduced TCPs to determine whether practice should continue unchanged, be modified or withdrawn based on locally-collected data. If collection methods are standardised, these data could be pooled at state, national and international level to provide detailed post-market information.

Introduction of mechanisms for prioritisation would improve the program [36]. The current process aims to ensure that a proposed new TCP is safe, effective and can be delivered within organisational capacity and capability. There is no systematic consideration of available alternatives or whether a proposed TCP, even if safe and effective, should be introduced at all. Resources could be saved or redirected to something that has greater impact, is more consistent with organisational priorities or has other benefits.

The structure of the program could be improved by introducing an eighth component for ‘Implementation’. The current Application and Monitoring components include principles that focus on the safety of implementation of a new TCP. Having a specific component would not only capture these but could also include principles that focus on, and highlight the importance of, an evidence-based approach to the implementation process.

The current system is reactive; it responds to individual applications, which are driven by non-systematic factors such as clinicians’ interests and exposure to promotion of new TCPs. This could be improved through a systematic proactive approach in which the organisation seeks out information on new TCPs that are already proven to be safer, more effective or more cost-effective than current practice and considers their fit with organisational objectives and the opportunity costs and risks incurred if they are, or are not, introduced.

There is considerable waste of resources when each health service replicates the information gathering steps to make the same decisions. Sharing of information could reduce this duplication. Monash Health publishes its Decision Summaries on the internet but provision of a central website to house this information at a state or national level might encourage similar publications by others and facilitate access and utilisation for decision-making.

The initial program model was sustainable but proved inadequate to address the aim of robust EBDM. The revised model is likely to meet the aim but, due to the resources required to deliver the evidence, may not be sustainable. Local decisions need to consider local factors; however, a systematic review of the literature should not be duplicated in each health service. Methods to encourage and facilitate publication of systematic reviews conducted for local decision-making could be explored.

Implications for research

The lack of research into decision-making processes for introduction of new technologies at the local level and the limitations of ‘knowledge purveyors’ in this context have been noted [33, 37]. This project highlights issues with the quality of information provided by applicants. Lack of knowledge and skills in evidence synthesis were identified as barriers at the outset and are consistent with the findings of others [38]. Resources and tools were developed and expert advice was provided; however, these initiatives were insufficient to enable applicants to provide trustworthy information. An education program in systematic review methods was not feasible, as the potential target audience was too large: any staff member could submit an application. There are systematic reviews on effectiveness of interventions to increase use of research in decision-making [39], education programs for Evidence Based Practice [40] and critical appraisal [41], printed education materials for practice change [42], electronic retrieval of information by health professionals [20] and tailored interventions to overcome barriers [43], but we were unable to find anything on the effectiveness of resources to guide or support clinicians and managers to undertake systematic reviews and health technology assessments for local decision-making. Further research in this area would support efforts to increase the quality of information provided for evidence-based decisions.

Although it is recommended in the Best Practice Guide (Additional file 2) and the potential benefit is acknowledged, most health services do not employ a health economist [44]. However it might be possible for health service staff to incorporate health economic principles in their decisions through application of algorithms or other resources. Research in development and evaluation of tools, templates and guidance materials would facilitate use of health economic methods in local decision-making in the absence of a health economist.


The Technology/Clinical Practice Program was established using an evidence-based approach to development, implementation and evaluation. The program components were based on a review of the literature, consultation with experts and stakeholders, assessment of characteristics of successful change models and factors for sustainability, identification of barriers and enablers to introduction of best practice and experience from implementation and evaluation in a large health service network.

A toolkit containing details of the processes and resources for implementation is provided to facilitate replication or local adaptation by those wishing to establish a similar program. The components are likely to be generalisable to most health care organisations and, with the exception of the systematic review process, would be achievable with minimal additional resources.

Expertise for effective HTA is not available in most health services, but even if it were, duplication of systematic reviews for the same TCP would be a waste of very limited resources. Some duplication is required for assessment of local factors such as capacity, capability and access but methods to share information that is applicable to all need to be explored.

Further research is required into EBDM for resource allocation at local level.



Abbreviations

CCE: Centre for Clinical Effectiveness

EBDM: Evidence-based decision-making

HTA: Health technology assessment

MSAC: Medical Services Advisory Committee

PBAC: Pharmaceutical Benefits Advisory Committee

TCP: Technology or clinical practice

TCPC: Technology/Clinical Practice Committee


  1. Department of Human Services. Guidance for Victorian public health services to establish Technology/Clinical Practice Committees. Melbourne: Victorian Government; 2006.

  2. Gallego G. Hospital drug and therapeutics committees in Australia: is there a role for economic evaluation at the institutional level? Int J Clin Pharm. 2011;33(6):895–7. doi:10.1007/s11096-011-9565-3.

  3. Lourenco T, Grant A, Burr J, Vale L. Local decision-makers’ views of national guidance on interventional procedures in the UK. J Health Serv Res Policy. 2010;15 Suppl 2:3–11. doi:10.1258/jhsrp.2009.009090.

  4. Schumacher I, Zechmeister I. Assessing the impact of health technology assessment on the Austrian healthcare system. Int J Technol Assess Health Care. 2013;29(1):84–91. doi:10.1017/s0266462312000748.

  5. Robinson JS, Turnbull DA. Changing healthcare organisations to change clinical performance. Med J Aust. 2004;180(6 Suppl):S61–2.

  6. Harris C, Turner T, Wilkinson F. SEAchange: guide to a pragmatic evidence-based approach to sustainable, effective and appropriate change in health services. 2015. Available from: Accessed: December 2015.

  7. Meyer J. Qualitative research in health care. Using qualitative methods in health related action research. BMJ. 2000;320(7228):178–81.

  8. Meyer J. Evaluating action research. Age Ageing. 2000;29 Suppl 2:8–10.

  9. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients’ care. Lancet. 2003;362(9391):1225–30.

  10. Grol R, Wensing M, Eccles M. Improving practice: a systematic approach to implementation of change in patient care. Oxford: Elsevier Science; 2004.

  11. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629. doi:10.1111/j.0887-378X.2004.00325.x.

  12. NSW Health. A framework for building capacity to improve health. Sydney: NSW Health Department; 2001.

  13. Grol R, Wensing M. What drives change? Barriers to and incentives for achieving evidence-based practice. Med J Aust. 2004;180(6 Suppl):S57–60.

  14. Quality and Safety Branch NSW Health, Branch HSPaI. Clinical practice - model policy for safe introduction of new interventional procedures. 2005. Available from: Accessed: May 2015.

  15. National Health Committee. Decision-making about new health interventions: a report to the New Zealand Minister of Health. 2006. Available from: Accessed: December 2015.

  16. Australian Capital Territory Health. Policy: introduction of new health technologies. Canberra: ACT Health; 2007.

  17. Australian Safety and Efficacy Register of New Interventional Procedures (ASERNIP-S). General guidelines for assessing, approving and introducing new procedures into a hospital or health service. Royal Australasian College of Surgeons. Available from: Accessed: December 2015.

  18. Veldman A, Santamaria-Araujo JA, Sollazzo S, Pitt J, Gianello R, Yaplito-Lee J, et al. Successful treatment of molybdenum cofactor deficiency type A with cPMP. Pediatrics. 2010;125(5):e1249–54. doi:10.1542/peds.2009-2192.

  19. Schwahn BC, Van Spronsen FJ, Belaidi AA, Bowhay S, Christodoulou J, Derks TG, et al. Efficacy and safety of cyclic pyranopterin monophosphate substitution in severe molybdenum cofactor deficiency type A: a prospective cohort study. Lancet. 2015. doi:10.1016/s0140-6736(15)00124-5.

  20. Harris C, Garrubba M. Finding the evidence: guide to the best available evidence to support introduction of new technologies and clinical practices. Centre for Clinical Effectiveness, Southern Health. 2011. Available from: Accessed: December 2015.

  21. McGregor M, Brophy JM. End-user involvement in health technology assessment (HTA) development: a way to increase impact. Int J Technol Assess Health Care. 2005;21(2):263–7.

  22. Victorian Policy Advisory Committee on Technology. Victorian Policy Advisory Committee on Technology (VPACT): terms of reference for role, function and governance. Department of Human Services, Victorian Government. 2009. Available from: Accessed: December 2015.

  23. Sydney South West Area Health Service. New interventional procedures: the safe introduction of new interventional procedures into clinical practice - policy directive. NSW Health. 2007.

  24. Ambulance Service of New South Wales. The safe introduction of new clinical interventions into ambulance practice - standard operating policy. Ambulance Service of New South Wales. 2009. Available from: Accessed: December 2015.

  25. Department of Health, Government of Western Australia. Policy for the introduction of high-cost new health technologies into the Western Australian public health system. 2005. Available from: Accessed: May 2015.

  26. Australian Health Ministers’ Advisory Council (AHMAC). Nationally Funded Centres guidance for governance, management, funding, establishment, review. 2010. Available from: Accessed: December 2015.

  27. Glew P. Policy for the introduction of new clinical procedure/intervention or technique. Plymouth: Plymouth Hospitals NHS Trust; 2008.

  28. Taft K, Halton T. New interventional procedures policy directive. Milton Keynes: Milton Keynes Hospital NHS Foundation Trust; 2009.

  29. Medical Advisory Secretariat Ontario. Decision determinants guidance document: the OHTAC decision-making process for the development of evidence-based recommendations. Ontario Ministry of Health and Long-Term Care. 2010. Available from: Accessed: December 2015.

  30. Calgary Health Research Portfolio, Alberta Health Services. Local health technology decision support program: 2009 revision. Department of Surgery and Surgical Services, University of Calgary. 2009. Available from: Accessed: December 2015.

  31. National Health Committee New Zealand. District Health Board decision-making about new health interventions: a background paper. The National Advisory Committee on Health and Disability. 2006. Available from: Accessed: December 2015.

  32. Queensland Policy and Advisory Committee for New Technology. Queensland Policy and Advisory Committee on New Technology (QPACT) decision-making framework for health technology investment in Queensland Health: a guidance document. Queensland Health. Available from: Accessed: December 2015.

  33. Stafinski T, Menon D, McCabe C, Philippon DJ. To fund or not to fund: development of a decision-making framework for the coverage of new health technologies. Pharmacoeconomics. 2011;29(9):771–80. doi:10.2165/11539840-000000000-00000.

  34. Department of Health and Ageing. Review of health technology assessment in Australia. Commonwealth of Australia. 2009. Available from:$File/hta-review-report.pdf. Accessed: December 2015.

  35. Fronsdal KB, Facey K, Klemp M, Norderhaug IN, Morland B, Rottingen JA. Health technology assessment to optimize health technology utilization: using implementation initiatives and monitoring processes. Int J Technol Assess Health Care. 2010;26(3):309–16. doi:10.1017/s0266462310000309.

  36. Mitton C, Donaldson C. Setting priorities in Canadian regional health authorities: a survey of key decision makers. Health Policy. 2002;60(1):39–58.

  37. Black N. Evidence based policy: proceed with care. BMJ. 2001;323(7307):275–9.

  38. Evans BA, Snooks H, Howson H, Davies M. How hard can it be to include research evidence and evaluation in local health policy implementation? Results from a mixed methods study. Implement Sci. 2013;8:17. doi:10.1186/1748-5908-8-17.

  39. Moore G, Redman S, Haines M, Todd A. What works to increase the use of research in population health policy and programmes: a review. Evid Policy. 2011;7(3):277–305. doi:10.1332/174426411X579199.

  40. Dorsch JL, Perry GJ. Evidence-based medicine at the intersection of research interests between academic health sciences librarians and medical educators: a review of the literature. J Med Libr Assoc. 2012;100(4):251–7. doi:10.3163/1536-5050.100.4.006.

  41. Horsley T, Hyde C, Santesso N, Parkes J, Milne R, Stewart R. Teaching critical appraisal skills in healthcare settings. Cochrane Database Syst Rev. 2011;11:CD001270. doi:10.1002/14651858.CD001270.pub2.

  42. Giguere A, Legare F, Grimshaw J, Turcotte S, Fiander M, Grudniewicz A, et al. Printed educational materials: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;10:CD004398. doi:10.1002/14651858.CD004398.pub3.

  43. Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, et al. Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2010;3:CD005470. doi:10.1002/14651858.CD005470.pub2.

  44. Eddama O, Coast J. A systematic review of the use of economic evaluation in local decision-making. Health Policy. 2008;86(2–3):129–41. doi:10.1016/j.healthpol.2007.11.010.



The authors would like to acknowledge the contribution of others: Monash Health, which provided funding and supporting infrastructure for this project (no conditions related to the project or this publication were imposed); members of the Technology/Clinical Practice Committee for direction, oversight and guidance; CCE staff members not on the project team who provided help and support; Monash Health staff who gave their time generously to share their thoughts and experiences; and Professor Sally Green, Professorial Fellow, School of Public Health and Preventive Medicine, Monash University, for review and critical comments on the manuscript.

Author information



Corresponding author

Correspondence to Claire Harris.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

RK, CK, MT and BC were members of the original Technology/Clinical Practice Committee that identified the need for change. CH and MG produced the initial best practice guide and developed, implemented and evaluated the TCP program. CH, MG, KA and DF revised the best practice guide. CH and KA developed and implemented the revised TCP program. RK, CK, MT, BC and WR provided direction and feedback during both projects. CH wrote the initial draft; MG and KA provided critical revisions; RK, CK, MT, BC and WR provided feedback on later drafts. All authors read and approved the final manuscript.

Authors’ information

CH was the Director of the Centre for Clinical Effectiveness and TCPC Executive Officer. MG was Senior Project Officer at CCE and TCPC Administrative Officer (Development phase). KA was Consultant at CCE and TCPC Administrative Officer (Revision phase). RK is Director of Medicine Program and TCPC Chair. CK was Director of Medical Services, MT is Legal Counsel for Research and Ethics, BC was Consumer Representative and were all members of TCPC. WR was Executive Director of Medical Services and Executive Sponsor of TCPC. DF was Project Officer at CCE.

Additional files

Additional file 1:

Harris C, Garrubba M and Allen K. Introduction of new health technologies and clinical practices: Toolkit for a transparent, accountable, evidence-based program for hospitals and health care organisations. 2014. Technology/Clinical Practice Program. Monash Health, Melbourne, Australia. (PDF 2131 kb)

Additional file 2:

Best Practice Guide. (PDF 423 kb)

Additional file 3:

Barriers and enablers and strategies to address them. (PDF 424 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Harris, C., Garrubba, M., Allen, K. et al. Development, implementation and evaluation of an evidence-based program for introduction of new health technologies and clinical practices in a local healthcare setting. BMC Health Serv Res 15, 575 (2015).
