
Relational aspects of building capacity in economic evaluation in an Australian Primary Health Network using an embedded researcher approach

Abstract

Background

Health organisations are increasingly implementing ‘embedded researcher’ models to translate research into practice. This paper examines the impact of an embedded researcher model known as the embedded Economist (eE) Program that was implemented in an Australian Primary Health Network (PHN) located in regional New South Wales, Australia. The site, participants, program aims and design are described. Insights into the facilitators, challenges and barriers to the integration of economic evaluation perspectives into the work of the PHN are provided.

Methods

The eE Program consisted of embedding a lead health economist on site, part-time for fifteen weeks, supported by offsite economists, to collaborate with PHN staff. Evaluation of the eE at the PHN included qualitative data collection via semi-structured interviews (N=34), observations (N=8) and a field diary kept by the embedded economists. A thematic analysis was undertaken through the triangulation of these data.

Results

The eE Program successfully met its aims of increasing PHN staff awareness of the value of economic evaluation principles in decision-making and their capacity to access and apply these principles. There was also evidence that the program resulted in PHN staff applying economic evaluations when commissioning service providers. Evaluation of the eE identified two key facilitators for achieving these results. First, a highly receptive organisational context, characterised by a strong work ethic and by site processes and procedures dedicated to improvement. Second, the development of trusted relationships between the embedded economist and PHN staff, which were enabled through: the economist's commitment to bi-directional learning; facilitating access to economic tools and techniques; personality traits (likeable and enthusiastic); and the ongoing support the eE provided for PHN projects beyond the fifteen-week embedding period.

Conclusions

This study provides the first detailed case description of an embedded health economics program. The results demonstrate how the process, context and relational factors of engaging and embedding the support of a health economist work, and why. The findings reinforce international evidence in this area and are of practical utility to the future deployment of such programs.


Introduction

Decisions are made daily in healthcare about the type of care that will be provided, not only to individual patients but also to improve people’s overall health and wellbeing. These decisions can be about new medicines, new technologies, improved models of care or approaches designed to promote health and prevent or manage avoidable illnesses within the population [1].

Determining whether healthcare spending choices represent value - for governments, patients and taxpayers - depends on whose perspective is taken, but in general, ‘value’ can be assessed by examining the impacts of spending choices on health service efficiency and equity. While such evaluation is common at the national level, economic evaluations are relatively rare at the local level [2]. The local level includes local health districts, hospitals, community and primary care services and it is where Australia spends most of its health budget. There are a multitude of reasons why economic evaluations are not done at the local level, but key ones are a lack of health economic skills in the health workforce and/or a lack of access to appropriately experienced health economists [2].

The gap in health economic skills in local health services was an important finding in a national report on local level evaluation [2]. Insights from this work showed local health services wanted to develop their own internal capacity and capability in evaluation, particularly in health economics [2]. They also wanted to work with experts who would be focused on the priorities of the health service and share their skills with staff, a focus that was often lacking when private sector or academic expertise was engaged [2]. The embedded Economist (eE) Program was developed to address this need by embedding a health economist in health services to build capacity in economic evaluation. The embedding was undertaken over approximately three months each in four local health districts (LHDs) and also in a Primary Health Network (PHN). This paper reports the evaluation of the eE in the Hunter New England Central Coast Primary Health Network (HNECCPHN).

The paper firstly presents the distinct PHN context and the background that informed the ‘economist-in-residence’ approach. Following sections present methods, findings and impact of the eE Program, before distilling our study results into a more general argument in support of developing local capacity and capability in economic evaluation.

Background

Australian Primary Health Networks

Australian PHNs were established on 1st July 2015 by the Australian Federal Government to increase the effectiveness of medical services for patients, particularly those at risk of poor health outcomes, and to improve coordination of care. The PHNs are not-for-profit organisations which do not directly provide health services. Instead, they use a commissioning model to work with primary health care providers, secondary care providers and hospitals. Commissioning includes a range of activities to assess the needs of the population, plan and prioritise services, purchase those services and monitor the quality of the services being provided. PHN commissioning is intended to move the local health system towards more sustainable models of care by not only procuring new or additional services but also transforming and reorganising existing services [3,4,5].

As a ‘newcomer’ to the Australian health services landscape, there is little research overall on PHNs and the effectiveness of their commissioning [3,4,5]. Recent evaluations have concluded that they are ‘making progress’ in demonstrating a deeper understanding of the needs of their local communities, building partnerships to address shared priorities, and developing innovative ways to commission services [6]. However, what we know from international experience is that effective commissioning relies on evidence-informed planning and evaluation of cost against effectiveness, including valid and feasible measures of quality and/or outcomes [7, 8]. Where commissioning services have been extensively studied, such as in England, the decision-making process has been described as pragmatic, involving ‘juggling competing agendas, priorities, power relationships, demands and personal inclinations to build a persuasive, compelling case,’ necessitating a change in approach by researchers to one that ensures the provision of localised and useful information [7]. As a result, commissioning in England as an approach to improve and integrate care services has often been described as ‘weak’ due to the lack of advanced commissioning skills [9]. Attempts in England to address this skills gap have included workshops on systematic evidence searching and critical appraisal of evidence quality [10]. There has also been advocacy for the use of economic modelling tools [11], especially to improve procurement processes and contract management.

Embedded researchers

Embedded researchers, or researchers-in-residence, are researchers who work as part of an operational team. Their role is not simply to bring new skills and expertise to the team. They are charged with ‘mobilising knowledge and creating new evidence for local use and wider dissemination’ [12]. Embedded researchers are required to ‘negotiate their expertise, integrate it with the expertise of their colleagues’ and, where necessary, compromise to reach ‘shared understanding and solutions’ [12].

This approach has been implemented in areas including education, the law, and social care [13]. Researchers have been embedded across a wide variety of clinical contexts in the United Kingdom, including public, private and voluntary settings that commission and/or deliver primary, community and/or acute care [12, 14, 15]. Similar initiatives have been implemented in the United States and Canada [16]. There is a small but growing number of embedded researcher initiatives in Australia, most of which have taken the form of a senior academic with clinical experience collaborating with health service staff on co-producing traditional research outputs [17, 18].

Regardless of jurisdiction or clinical setting, the literature on embedded researchers is rated by Ward and colleagues [15] as lacking in analysis ‘disaggregating the components’ of such initiatives, concentrating instead on ‘overviews of the principles of embedded research.’ The result is little information about what embedded researcher initiatives look like in practice [15]. There is agreement on the need for more research into how the models are implemented, how they work, under what conditions and for whom [17,18,19,20]. Varallyay et al. [21] regard this as ‘…particularly important given the lack of clarity about the core features and conditions of “embeddedness” minimally required to achieve the objective of evidence-informed decision-making for health programme improvement.’

Despite the lack of formal evaluations to date, guidance on designing embedded initiatives is not lacking. The literature identifies and lists a number of key success factors for embedded research including various process [13, 18, 19, 22,23,24,25], contextual [18, 19, 22, 23, 26] and relational factors [13, 19, 22, 25, 27].

The embedded Economist (eE) program

Against this background, this article sets out results from embedding a health economist in the first of six program sites, a regional New South Wales (NSW) PHN. This site was unique, the others being large health services centred on hospitals. The full planned program protocol, including a brief literature review and overview, has been reported in an earlier edition of this journal [1]. The conceptual underpinnings are also reported in detail elsewhere [28]. The project is grounded in 'slow science' [29] and focused on achieving impact at the site of research, with academic outputs such as publications not an objective (though supported if desired by the site).

By way of summary, the eE Program had three aims:

  1. To increase health service staff awareness of the value economic evaluation can bring to decision-making.

  2. To develop health service staff knowledge and capacity to access and apply economic evaluation principles, methods and tools in decision-making through formal training and extended exposure to an embedded economist.

  3. To facilitate health service practice change and the routine application of economic evaluation principles in decision-making [1].

The aims of the evaluation were to capture the outcomes and impact of embedding an economist and to evaluate the contextual, procedural and relational aspects that facilitated or acted as a barrier to the program [1]. More detail is provided on the relational findings as they add considerably to the published literature.

Methods

Setting

The Hunter New England Central Coast Primary Health Network (HNECCPHN) is a new (approximately six years old) and medium-sized organisation, consisting of a chief executive; four senior executives; 11 board members; a clinical council; a community council; and approximately 100 full-time equivalent staff. The HNECCPHN region covers 130,000 square kilometres, as set out in Fig. 1, and has a population of 1.2 million people who live in small rural and remote villages, in regional towns and in densely populated urban centres. The organisation has three main offices in Erina, Newcastle and Tamworth, located hundreds of kilometres apart.

Fig. 1 The HNECCPHN geographical footprint. Source: https://thephn.com.au/about-us

Participants

The chief executive, all senior executives (N=4) and all staff (N=100) had some form of interaction with the eE over the three program phases. A lead health economist and four support health economists from the Hunter Medical Research Institute (HMRI) also participated. Specific details, including the number and type of participants, are set out in Supplementary file 1.

Site study design

A senior ‘front of house’ lead health economist (professorial level) was embedded within the HNECCPHN from 15 October 2019 through to 29 February 2020. Allowing for Christmas and New Year breaks, the elapsed time was approximately fifteen weeks. The economist worked with PHN staff to conduct economic evaluations and advise on evaluation and impact design and the inclusion of economics principles, as appropriate, in day-to-day business. While the lead economist conducted all the face-to-face interactions with PHN staff, ‘back of house’ support economists were engaged from the HMRI Health Economics team to provide assistance in conducting the economic evaluations and to provide specialist expertise as needed. The program was structured into three phases: planning, embedding and post-embedding [1]. A detailed description of each phase, how they were implemented, and who participated in the HNECCPHN eE Program is set out in Supplementary file 1. Specific details about the number and type of engagement with the economist are set out in Supplementary file 2.

Evaluation design

The evaluation took place concurrently with the eE Program, with the last interviews conducted in mid-July 2020, approximately five months after the embedding phase was complete. Ethics approval was obtained from the Human Research Ethics Committee of the University of New England (approval number H-2018-0005), with written consent provided by all participants. All methods were carried out in accordance with relevant guidelines and regulations.

Potential evaluation participants were all HMRI economists and all PHN senior executives and staff members participating in the eE Program. Complementary data were collected at each phase of the project to capture insights into the context, processes, relational aspects and outcomes of the program [25]. Methodological triangulation was undertaken by cross-checking, comparing and contrasting the different data sources, which added value to each other by illuminating different aspects of an issue and potentially explaining unexpected findings. For example, the reflective field diaries kept by the embedded economists provided data on what they thought and did [22, 25]. Interviews allowed issues identified by the evaluator to be explored in more detail with the economist, site executives and participants. Interviews were semi-structured and followed an interview guide based on the program aims and key themes from the embedded researcher literature. Observation notes enabled the social scientist to identify whether what people said in interviews and field diaries matched what was done in real time.

Data collection and data analysis

The data collected before, during and after the economist was embedded at the PHN are listed in Table 1.

Table 1 Data collected for evaluation

Interviews were recorded with consent and professionally transcribed. Participants were invited to review their de-identified transcripts and request amendments before the transcripts were entered into QSR International's NVivo qualitative data analysis software (Version 12) [30]. Documents were provided by the sites. The researchers also produced field diaries and observation notes. All data were entered into NVivo.

A thematic analysis was conducted on the data by applying a coding framework with predetermined themes. The themes were based on the researchers' review of the embedded researcher literature used to design the program (summarised in the introduction of this paper), as well as on the program aims. The coding framework was tested and refined by two of the researchers (DP and CJ), who each blind-coded six interviews and refined the manual and coding based on a comparison of their coding of these initial interviews. The refined coding manual was then applied to the remaining transcripts via an iterative, reflexive and concurrent process of refining themes [31, 32]. The coded themes were corroborated and legitimated by scrutinising the previous stages to ensure that clustered themes were representative of the initial data analysis and assigned codes.

Results

Key outputs

The key outputs include six projects that were undertaken by the HNECCPHN whilst the economist was embedded. A variety of economic evaluation skills and tools were developed and applied during the projects including: business case development; applying cost modelling principles and cost consequence tools and processes; developing and implementing a program logic model; and identifying and collecting appropriate outcome measures to support program evaluation and impact assessment. Supplementary file 3 presents an overview of the projects.

Key impacts

Key impacts are summarised in Table 2. Interviews with all program participants suggest the program successfully increased staff awareness of the benefits of economic evaluation. Increased awareness took the form of looking at work differently and thinking about evaluation more, from a program or initiative's inception. Both senior executive (N=2) and staff participants (N=10) believed the eE Program had developed their capacity to access and apply economic evaluation by providing additional information, knowledge and tools through the six projects (see Supplementary file 3 for more detail on tools and skills development). In addition, interviews with the site lead and staff participants (N=3) suggest there was limited but emerging evidence of practice change. However, senior executive (N=2) and staff participants (N=12) stated they wanted ongoing support if capacity building was to continue:

Ongoing support would be ideal…what we found was we – we were sorry that the time was coming to an end because we saw an ongoing need for the embedded Economist in providing further support and upskilling, and capability development, which I suppose reflects the success of the initiative in its early stages. PHN PARTICIPANT 4

Table 2 Overview of key impacts

Key findings

Tables 3, 4, and 5 set out the process, contextual and relational facilitators and barriers encountered for each phase of the program.

Table 3 Summary of planning phase process facilitators
Table 4 Summary of embedding phase process facilitators and barriers
Table 5 Summary of planning and embedding phase contextual facilitators and barriers

Process facilitators and barriers

Three process issues facilitated the planning phase. The first was Board and active executive sponsorship within the HNECCPHN. The second was the seniority of the lead economist and the applied experience of all the support economists: more than 20 years of applied experience working with health services, and an understanding of the regulatory context in which the PHN operated, were considered crucial to encourage executive teams to consider new ways of thinking and working. Thirdly, the eE Program was facilitated by the appointment of a site lead, a PHN senior executive who 'concierged' the program from within the organisation, liaising with the program manager and acting as a gateway for staff to access the economists.

Three process issues facilitated the embedding phase: the physical location of the economist in an open-plan office; a single contact point and administrative support to book formal meeting requests; and identification of existing in-house expertise.

Process challenges in the embedded phase included: the depth of embedding (economist time available per week); the need for a communication and visibility strategy; the need for an exit strategy; and the overall length of the embedding period. The economist was embedded for 328.7 hours - approximately 22 hours per week. This partially embedded model was implemented because of the available project resources. It meant, however, that the economist was not fully relieved of their everyday job and was constrained by competing interests and demands. Despite this, the economist was available outside of their physically embedded time by telephone and via virtual meetings.

The first two weeks of embedding resulted in less engagement than expected, with only 20 staff members engaging over this period. A more detailed communication strategy was then developed, consisting of: three morning teas (one face-to-face and two conducted via Skype with satellite offices) where the lead economist explained the program to managers; a 'pitch' emailed by the site lead to managers asking for project nominations, to be ranked in order of organisational priority by the executive; and a presentation by the lead economist at the annual 'All Staff Day' to increase program visibility and foster staff engagement. Given the success of this presentation, and with the benefit of hindsight, the site lead thought it would have been beneficial, during the planning and embedding phases, to communicate earlier, engage all staff, and clarify the process for engaging with the economist to ensure a strategic approach to this limited resource. Staff also wanted improved communication throughout the embedded phase, including about what other projects the economist was working on. Overall, embedded research takes time to build visibility and momentum; timeframes may need to be adjusted to accommodate this, and staff need to be made aware that not everyone will have access, particularly when the economist is only placed at the site for a limited period.

Despite a slow start, a crescendo of projects was presented toward the end of the embedded period, predominantly as a result of visiting a satellite site for the first time during the last week of embedding, which resulted in ongoing remote work. This highlighted the desirability of developing an exit strategy, and of moderating expectations about how much work could be done and when the work would end, earlier in the program.

The embedded component of the HNECCPHN project was scheduled for three months but actually went over time by approximately three weeks. Three months was perceived by all participants as the shortest possible timeframe, with six months cited as more realistic, as projects presented late into the embedded phase, right up to the last day of embedding. A longer lead time for the economist to immerse within the organisation and identify appropriate projects and priorities would have been beneficial.

These process issues are interrelated, as they represent consequences of limited economist time and eventually high demand for services. For instance, a slightly slow start may not have been identified as an issue under a different schedule, or if the engagement had not been perceived as such high value.

Contextual facilitators and barriers

A number of organisational characteristics impacted positively on the planning and embedding phases of the HNECCPHN program. The PHN's medium size (approximately 100 full-time equivalent [FTE] staff) cultivated an enthusiastic and receptive culture where employees could easily see the benefit and impact of the eE Program. These factors made it feasible for the eE Program to achieve broad organisational reach.

Relevance of the eE Program to organisational function was another facilitator. Given the focus of the eE Program was on upskilling participants in economic evaluation, its relevance to commissioning was easily understood. Senior executives and staff recognised the lack of skills and capability in relation to economic evaluation and were motivated to take the opportunity the eE Program presented.

The eE Program therefore came along at a time that aligned with the PHN's stage of organisational evolution. The organisation was moving from an establishment phase to attempting to commission services, support improvement in a range of services and demonstrate impact. It was hoped the eE Program would provide tools and training to inform a more systematic and rigorous way of making decisions about planning and commissioning.

Staff’s limited time impacted on engaging with the economists, with some participants expressing regret at the end of the program that they had not had time to engage more.

Geography impacted negatively on the eE Program at HNECCPHN. Embedding at least partially face-to-face onsite was considered essential to the program by all participants, predominantly because of the opportunities this presented for quick, unplanned interactions and real-time feedback. There was the perception that staff in the Newcastle office benefited more from the project than staff in other offices because of the greater face-to-face contact (Table 5).

Relational facilitators and barriers

The following three subsections set out the themes and sub-themes that emerged from data coded under ‘relationality and engagement.’ This code collected data that addressed: how and why engagement occurred or did not occur; and what relational processes, mechanisms, attributes and skills were required for engagement to occur.

Senior executives (N=3) and staff participants (N=12) identified a number of positive attributes of the economist that they perceived as facilitating the program, as set out in Table 6. Economists (N=2), PHN executives (N=2) and staff (N=16) provided insights on the ways of working.

Table 6 Summary of relational facilitators

These relational facilitators enabled the economist to provide tailored support and capacity building to facilitate the application of relevant tools and approaches to specific problems within the health service. Participants (N=12) in four of the six projects undertaken with the economist commented on this coaching approach. ‘Coaching’ was characterised by the lead economist as problem solving rather than imposing a solution.

Post-embedding, the economist's willingness to continue coaching three staff members impacted positively on the program: it enabled further capacity building and ensured the trust built throughout the embedded phase was not broken by abandonment at the end of the program timeframe. During this phase, 'coaching' was less intense. The eE's field diary reveals ongoing sporadic email and Zoom contact about two projects over approximately two and a half months, and email contact for approximately one year post-embedding.

This way of working was described by one economist as different from their usual way of working, in that closer relationships were formed, facilitating greater learning for both the economists and PHN participants. Coaching was underpinned by bi-directional knowledge exchange, with the economists learning as much as staff participants.

Discussion

Overall, our results confirm the available commentary on embedded research, with our process, contextual and relational facilitators, challenges and barriers resembling the high-level ones identified by previous researchers examining the implementation of embedded models in health care [12, 13, 18, 19, 22, 25]. Particular practical lessons that should be considered when implementing future embedded research are catalogued below.

Process factors

We confirmed the importance of a number of previously suggested process facilitators including the need to appoint site leads [22, 25]. Whilst our process for communication and engagement did result in securing executive engagement and the identification of a champion in the form of a trained economist, difficulty came in explaining how the economist would work and managing expectations about level of embeddedness and chiefly, how the eE would leave the site. Engagement should have occurred earlier and prior to embedding, in a longer designated planning phase [18, 19, 22]. The development of an operational plan helped with scope, but the volume of work in the available time and exit from the site were difficult to manage – especially considering the longer than expected lead time to co-produce the scope of work. We all had ‘little experience of the roles and therefore it took time and learning from all parties to embed in the role’ [24]. However, we advocate for the need for any planning phase to ensure program flexibility. Once health services identify the problems they wish to address, the program needs to be able to pivot to provide whatever skills are required. This might mean there is a need to find another researcher with a particular expertise, for example, statistical modelling.

Our program funding extended only to a part-time embedded researcher for three months. The time it took to gain momentum, the need for work to be conducted post-embedding and the economists’ struggles with competing work interests suggests more secure funding to allow flexibility in length and depth of embedding would have been preferable [18, 22, 24, 25].

Co-located workdays (including having a desk, access to IT systems, administrative support, communal areas and meeting rooms) enabled staff to seek informal support, which was greatly valued by staff in this particular location. Pain and colleagues suggest that 'feelings produce impacts produce feelings': emotional dimensions are not side-effects but are active in generating impact [27]. The warm relationships developed were central to program success. Establishing relationships requires a large investment at the front end of a program, necessitating a 'high level of physical presence and face-to-face interactions' [13]. Once established, these relationships may be maintained virtually, although some ongoing face-to-face contact is necessary [13]. Lack of co-location was challenging for the geographically dispersed PHN sites that did not have face-to-face access, confirming Vindrola-Padros' (2019) research [13].

Contextual factors

The form, size and culture of the site all contributed positively to the program [18]. The PHN is a medium-sized organisation (approximately 100 FTE), so a small number of economists could feasibly effect change, with new conversations about evaluation documented as occurring widely. It is a relatively new organisation (established 2015), open to learning and actively seeking to improve the way it works and develop better processes. The PHN committed to working with the economists to upskill staff in economic evaluation, a skill deemed necessary, and lacking, by both senior executives and other staff to improve decision-making. The economists approached the embedded phase well informed of the organisational context and with a willingness to be responsive and adapt to organisational needs. The PHN itself was open and responsive to the eE Program. Staff were highly motivated to increase their skills in economic evaluation to improve commissioning and 'make a difference' [22]. The subject matter was seen as responsive and tailored to organisational needs: PHN funders are placing increasing emphasis on demonstrating value (an external contextual factor), and much of the co-designed work focused on impact assessment, which is highly relevant to all the work of a commissioning organisation [19, 23].

Relational factors

The literature acknowledges the importance of embedded researchers having social and interpersonal skills as well as technical or topic specific expertise. Desirable interpersonal skills include inquisitiveness, receptiveness and enthusiasm as well as communication skills [18, 22, 24]. Our findings are confirmatory - embedded researcher skills and attributes are pivotal in establishing the way of working or interactions during the embedded period, which is in turn pivotal in ensuring success.

However, our research goes further, adding to the evidence on the importance of relational issues in embedded research. The greatest facilitator for this program was the way of working: the 'coaching' or 'slow science' [18, 29] approach facilitated by the economists. Previous authors have described a variety of ways of working based on degrees of objectivity and levels of embeddedness [18, 33, 34], the 'role domains' of knowledge brokering [14, 35] and 'mechanisms of collaboration for co-production' [36]. We argue that embedded research will benefit from viewing the work done, or the interactions, through the additional lens of embedded processes as 'slow science' [28, 29].

In line with our philosophical commitment to slow science [28], these results build on that thinking by unpacking how capacity building can occur during embedding. Building capability in health economics, or any other expert skill, within health or commissioning organisations calls for a 're-invention' of health service research: moving from the pursuit of grand scientific findings that apply to all services everywhere towards a more nuanced combination of understanding local context and needs, building workforce capacity, designing realistic plans of action, and targeting programs of investigation at the priority problems of the health service. Instead of bypassing complex challenges in order to adhere to theoretical-methodological strictures, this re-invention of health service research takes local complexity as its primary point of departure. This only becomes possible, however, when we engage healthcare staff as co-researchers and prioritise practical achievement and future-facing learning over the production of discipline-specific knowledge and method-delimited evidence [28].

Strengths and limitations

These results must be considered in light of the following limitations. The eE Program embedded a particular skill set (health economics) in a particular context (a Primary Health Network located in regional Australia). Evaluations at future Local Health District (care provider) sites, where the economists will embed within the NSW Regional Health Partners' footprint, may produce quite different results. This is a small-scale, qualitative case study. While all program participants from this site were approached to participate in the evaluation, participants self-selected, which increases the risk of bias. The evaluation was conducted by a social scientist who also occupied the dual role of program manager. To ensure the research remained theoretically and methodologically sound, however, the embedded Economist program was overseen by the eE Social Science Research Committee.

Conclusion

This article demonstrates that the previously reported evaluation components of the study protocol [1] were effective in identifying barriers and facilitators to embedded research. Our evaluation contributes to the empirical evidence on embedded research in three main ways. First, it examines embedded researchers in a previously unexplored context: a regional Australian primary health commissioning network. Second, it describes a detailed design, or schema, for an embedded program, along with ideas for improvement that others can adapt. Third, it adds to the evidence on the importance of relational issues in embedded research: the greatest facilitator for this program was the way of working, the coaching or 'slow science' approach employed by the economists.

Despite its stretched resources, this embedded research project had impressive outcomes. The organisational outputs, in the main, will not be reflected in academic outputs. We hope this case study is the start of a journey in which continued funded engagement will increase the skills of HNECCPHN staff and foster research endeavours that are democratically designed and undertaken, and that also contribute to the academic literature. Elsewhere we have outlined the view that healthcare research warrants a more local, contextualised and practical approach in order to tackle health service complexity with the requisite fine-grained sensitivity and local relevance [28].

Overall, this project worked because the organisation was highly receptive and the skills offered were a perfect fit for a commissioning organisation; because of the democratic stance taken by the economists ('also learning') and their efforts to make their tools and techniques accessible ('it's not rocket science'); and, finally, because of the warmth and enthusiasm all parties brought to the engagement.

Availability of data and materials

The datasets generated and analysed during the current study are not publicly available due to the small sample size and size of the participating organisation. The researchers did not receive consent from participants to make data publicly available, as participants required anonymity to ensure they spoke freely about their organisation and the program. To access available data sets, please contact Dr Lisa McFayden, Lisa.mcfayden@health.nsw.gov.au

Abbreviations

ComPrac:

Community of Practice

eE:

embedded Economist

HMRI:

Hunter Medical Research Institute

HNECCPHN:

Hunter New England Central Coast Primary Health Network

IT:

Information Technology

LHD:

Local Health District

NSWRHP:

New South Wales Regional Health Partners

PHN:

Primary Health Network

References

  1. Searles A, Piper D, Jorm C, Reeves P, Gleeson M, Karnon J, et al. Embedding an economist in regional and rural health services to add value and reduce waste by improving local-level decision-making: Protocol for the ‘embedded Economist’ program and evaluation. BMC Health Serv. Res. 2021;21(1):1–13.

  2. Searles A, Gleeson M, Reeves P, Jorm C, Leeder S, Karnon J, et al. The Local Level Evaluation of Healthcare in Australia. 2019. https://nswregionalhealthpartners.org.au.

  3. Henderson J, Javanparast S, MacKean T, Freeman T, Baum F, Ziersch A. Commissioning and equity in primary care in Australia: Views from Primary Health Networks. Health Soc Care Community. 2018;26(1):80–9.

  4. Freeman T, Baum F, Javanparast S, Ziersch A, Mackean T, Windle A. Challenges facing primary health care in federated government systems: Implementation of Primary Health Networks in Australian states and territories. Health Policy. 2021;125(4):495–503.

  5. Anstey M, Burgess P, Angus L. Realising the potential of health needs assessments. Aust Health Rev. 2018;42(4):370–3.

  6. Lane RI, Russell GM, Francis C, Harris M, Powell-Davies G, Jessop R, et al. Evaluation of the primary health networks program. Canberra: Australian Government Department of Health; 2018.

  7. Wye L, Brangan E, Cameron A, Gabbay J, Klein JH, Pope C. Evidence based policy making and the ‘art’ of commissioning – How English healthcare commissioners access and use information and academic research in ‘real life’ decision-making: An empirical qualitative study. BMC Health Serv. Res. 2015;15(1):1–12.

  8. Clarke A, Taylor-Phillips S, Swan J, Gkeredakis E, Mills P, Powell J, et al. Evidence-based commissioning in the English NHS: Who uses which sources of evidence? A survey 2010/2011. BMJ Open. 2013;5:e002714.

  9. Addicott R. Challenges of commissioning and contracting for integrated care in the National Health Service (NHS) in England. Aust. J. Prim. Health. 2016;22(1):50–4.

  10. Sabey A. An evaluation of a training intervention to support the use of evidence in healthcare commissioning in England. Int J Evid Based Healthc. 2020;18(1):58–64.

  11. Sanders T, Grove A, Salway S, Hampshaw S, Goyder E. Incorporation of a health economic modelling tool into public health commissioning: Evidence use in a politicised context. Soc. Sci. Med. 2017;186:122–9.

  12. Marshall M, Eyre L, Lalani M, Khan S, Mann S, de Silva D, et al. Increasing the impact of health services research on service improvement: The researcher-in-residence model. J R Soc Med. 2016;109(6):220–5.

  13. Vindrola-Padros C, Eyre L, Baxter H, Cramer H, George B, Wye L, et al. Addressing the challenges of knowledge co-production in quality improvement: Learning from the implementation of the researcher-in-residence model. BMJ Qual Saf. 2019;28(1):67–73.

  14. Wye L, Cramer H, Beckett K, Farr M, Le May A, Carey J, et al. Collective knowledge brokering: The model and impact of an embedded team. Evid. Policy. 2020;16(3):429–52.

  15. Ward V, Tooman T, Reid B, Davies H, Marshall M. Embedding researchers into organisations: A study of the features of embedded research initiatives. Evid. Policy. 2021;17(4):593–614.

  16. Vindrola-Padros C. Can We Re-Imagine Research So It Is Timely, Relevant and Responsive? Comment on "Experience of Health Leadership in Partnering with University-Based Researchers in Canada: A Call to 'Re-Imagine' Research". Int J Health Policy Manag. 2021;10(3):172–5.

  17. Coates D, Mickan S. Challenges and enablers of the embedded researcher model. J Health Organ Manag. 2020. https://doi.org/10.1108/JHOM-02-2020-0043.

  18. Churruca K, Ludlow K, Taylor N, Long JC, Best S, Braithwaite J. The time has come: Embedded implementation research for health care improvement. J. Eval. Clin. Pract. 2019;25(3):373–80.

  19. Vindrola-Padros C, Pape T, Utley M, Fulop NJ. The role of embedded research in quality improvement: A narrative review. BMJ Qual Saf. 2017;26(1):70–80.

  20. Marshall M, Pagel C, French C, Utley M, Allwood D, Fulop N, et al. Moving improvement research closer to practice: The Researcher-in-Residence model. BMJ Qual Saf. 2014;23(10):801–5.

  21. Varallyay NI, Langlois EV, Tran N, Elias V, Reveiz L. Health system decision-makers at the helm of implementation research: Development of a framework to evaluate the processes and effectiveness of embedded approaches. Health Res. Policy Syst. 2020;18(1):1–12.

  22. Cheetham M, Wiseman A, Khazaeli B, Gibson E, Gray P, Van der Graaf P, et al. Embedded research: A promising way to create evidence-informed impact in public health? J Public Health. 2018;40(suppl_1):i64–70.

  23. Walley J, Khan MA, Witter S, Haque R, Newell J, Wei X. Embedded health service development and research: Why and how to do it (a ten-stage guide). Health Res. Policy Syst. 2018;16(1):67.

  24. Gradinger F, Elston J, Asthana S, Martin S, Byng R. Reflections on the researcher-in-residence model co-producing knowledge for action in an Integrated Care Organisation: A mixed methods case study using an impact survey and field notes. Evid Policy. 2019;15(2):197–215.

  25. Wye L, Cramer H, Carey J, Anthwal R, Rooney J, Robinson R, et al. Knowledge brokers or relationship brokers? The role of an embedded knowledge mobilisation team. Evid Policy. 2019;15(2):277–92.

  26. Rapport F, Clay-Williams R, Churruca K, Shih P, Hogden A, Braithwaite J. The struggle of translating science into action: Foundational concepts of implementation science. J. Eval. Clin. Pract. 2018;24(1):117–26.

  27. Pain R, Askins K, Banks S, Cook T, Crawford G, Crookes L, et al. Mapping Alternative Impact: Alternative approaches to impact from co-produced research. 2016. https://eprints.gla.ac.uk/115470/1/115470.pdf. .

  28. Jorm C, Iedema R, Piper D, Goodwin N, Searles A. “Slow science” for 21st century healthcare: Reinventing health service research that serves fast-paced, high-complexity care organisations. J Health Organ Manag. 2021;35:701–16.

  29. Stengers I. Another science is possible: A manifesto for slow science. Cambridge: Wiley; 2018.

  30. QSR International Pty Ltd. NVivo (Version 12). 2018. https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software/home. Accessed 2020-2022.

  31. Yukhymenko MA, Brown SW, Lawless KA, Brodowinska K, Mullin G. Thematic Analysis of Teacher Instructional Practices and Student Responses in Middle School Classrooms with Problem-Based Learning Environment. Glob. Educ. Rev. 2014;1(3):93–110.

  32. Willmott L, White B, Yates P, Mitchell G, Currow DC, Gerber K, et al. Nurses’ knowledge of law at the end of life and implications for practice: A qualitative study. Palliat. Med. 2020;34(4):524–32.

  33. Coates D, Mickan S. The embedded researcher model in Australian healthcare settings: Comparison by degree of “embeddedness”. Transl Res. 2020;218:29–42.

  34. Duggan JR. Critical friendship and critical orphanship: Embedded research of an English local authority initiative. Manag. Educ. 2014;28(1):12–8.

  35. Glegg SM, Hoens A. Role domains of knowledge brokering: A model for the health care setting. J Neurol Phys Ther. 2016;40(2):115–23.

  36. Heaton J, Day J, Britten N. Collaborative research and the co-production of knowledge for practice: An illustrative case study. Implement Sci. 2015;11(1):1–10.

Acknowledgements

We acknowledge the participation of the Hunter New England Central Coast Primary Health Network staff and Hunter Medical Research Institute embedded economists.

We also acknowledge the efforts and help of the following members of the embedded Economist Steering Committee, who provided oversight, advice and direction on the program design and implementation: Mr Peter Johnson (consumer representative), Ms Wendy Keech (Director, Health Translation SA), Mr Michael Di Rienzo (Chief Executive, Hunter New England Local Health District), Dr Antonio Penna (Executive Director, Office for Health and Medical Research) and Professor Rachael Morton (Director, Health Economics, NHMRC Clinical Trials Centre, Faculty of Medicine and Health, The University of Sydney).

Funding

This eE Program pilot study is funded by the Australian Government’s Medical Research Future Fund (MRFF) as part of the Rapid Applied Research Translation program (MRF9100005). The study protocol has been independently peer reviewed by the funding body. The funding entity had no other role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Author information

Authors and Affiliations

Authors

Contributions

All authors have provided input to, reviewed, edited and approved the final version. Conceptualization: CJ, RI, NG, AS and DP. Funding acquisition: CJ. Methodology: CJ, RI, NG and DP. Project administration: CJ, DP and LM. Writing - original draft: DP. Writing - review & editing: DP, CJ, RI, NG, AS and LM.

Corresponding author

Correspondence to Lisa McFayden.

Ethics declarations

Ethics approval and consent to participate

Ethics approval for the embedded component and evaluation for the HNECCPHN site was granted by the University of New England Ethics Committee, approval number: HE19-196. Ethics approval for the education component for the HNECCPHN site was granted by the Hunter New England Research Ethics Committee approval number: 2020/ETH00012. These approvals were ratified and registered with the University of Newcastle Research Integrity Unit number: H-2020-0234. Written informed consent was obtained from all participants.

Consent for publication

Not applicable

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Piper, D., Jorm, C., Iedema, R. et al. Relational aspects of building capacity in economic evaluation in an Australian Primary Health Network using an embedded researcher approach. BMC Health Serv Res 22, 813 (2022). https://doi.org/10.1186/s12913-022-08208-7


Keywords

  • Health economics
  • Economic evaluation
  • Program evaluation
  • Embedded researcher
  • Health services research
  • Value-based healthcare
  • Primary care
  • Commissioning
  • Australia