- Open Access
Process evaluation of a cluster randomised controlled trial to improve bronchiolitis management – a PREDICT mixed-methods study
BMC Health Services Research volume 21, Article number: 1282 (2021)
Bronchiolitis is the most common reason for hospitalisation in infants. All international bronchiolitis guidelines recommend supportive care, yet considerable variation in practice continues, with infants receiving non-evidence-based therapies. We developed six targeted, theory-informed interventions: clinical leads, stakeholder meeting, train-the-trainer, education delivery, other educational materials, and audit and feedback. A cluster randomised controlled trial (cRCT) found the interventions to be effective in reducing use of five non-evidence-based therapies in infants with bronchiolitis. This process evaluation paper aims to determine whether the interventions were implemented as planned (fidelity), explore end-users’ perceptions of the interventions, and relate cRCT outcome data to intervention fidelity data.
A pre-specified mixed-methods process evaluation was conducted alongside the cRCT, guided by frameworks for process evaluation of cRCTs and complex interventions. Quantitative data on the fidelity, dose and reach of the interventions were collected from the 13 intervention hospitals during the study and analysed using descriptive statistics. Qualitative data on perceptions and acceptability of the interventions were collected from 42 intervention hospital clinical leads on study completion and analysed using thematic analysis.
The cRCT found targeted, theory-informed interventions improved bronchiolitis management by 14.1%. The process evaluation data showed variability in how the intervention was delivered at the cluster and individual levels. Total fidelity scores ranged from 55 to 98% across intervention hospitals (mean = 78%; SD = 13%). Fidelity scores were highest for use of clinical leads (mean = 98%; SD = 7%), and lowest for use of other educational materials (mean = 65%; SD = 19%) and audit and feedback (mean = 65%; SD = 20%). Clinical leads reflected positively on the interventions, with time constraints being the greatest barrier to their use.
Our targeted, theory-informed interventions were delivered with moderate fidelity and were well received by clinical leads. Despite the time constraints clinical leads experienced, the level of fidelity achieved supported the successful de-implementation of non-evidence-based care in infants with bronchiolitis. These findings will inform widespread rollout of our bronchiolitis interventions and guide future practice change in acute care settings.
Australian and New Zealand Clinical Trials Registry: ACTRN12616001567415.
Bronchiolitis is the most common respiratory condition affecting infants. It is the leading cause of hospital admission in infants less than 1 year of age in developed countries. Management is well defined, with international guidelines consistently recommending respiratory and hydration support [2,3,4,5]. Despite high-quality evidence of no benefit, and of potential harm, from the use of chest x-ray (CXR), salbutamol, antibiotics, glucocorticoids and adrenaline, these five therapies continue to be widely used. In Australia and New Zealand, data from over 3400 presentations to seven hospitals show that at least one of these five therapies was used at least once in 27 to 48% of bronchiolitis admissions. These data are consistent with findings from the United Kingdom, North America and Europe, and highlight the gap between evidence and current clinical practice that exists internationally.
Implementation research is the scientific study of methods to promote the uptake of research into routine practice, including the development and evaluation of interventions designed to reduce the evidence-practice gap. Dissemination of clinical practice guidelines alone is seldom sufficient to drive change in practice; more active and targeted strategies are required for change to occur. Approaches that use theories of behaviour change and address both the barriers to and enablers of recommended practice are more likely to be effective [10, 11]. De-implementation, or reducing the use of low-value healthcare, has received less attention than implementation and is often considered more difficult. Healthcare systems are urged to embrace learnings from implementation science initiatives to date, to avoid repeating previous efforts, and to support the development of an international de-implementation network.
In response to the identified practice variation in the treatment of infants with bronchiolitis, and recognising the importance of reducing the use of low-value and inappropriate therapies, targeted, theory-informed interventions were developed aiming to increase compliance with five key recommendations from the Australasian Bronchiolitis Guideline. Interventions were developed using a stepped approach, addressing factors influencing bronchiolitis management previously identified during qualitative clinician interviews utilising the Theoretical Domains Framework (TDF). This validated framework incorporates a wide range of behaviour change theories for use in implementation research and has a demonstrated track record of explanatory and predictive power across healthcare settings. Guidance has been developed to inform the choice of behaviour change techniques most likely to tackle identified issues [17, 18]. The six bronchiolitis interventions chosen and developed were: 1. Clinical leads; 2. Stakeholder meeting; 3. Train-the-trainer workshop; 4. Educational intervention delivery; 5. Additional educational and promotional materials; 6. Audit and feedback. Table 1 details the bronchiolitis interventions and causal assumptions. The interventions were evaluated in an international multi-centre cluster randomised controlled trial (cRCT) and demonstrated effectiveness at improving bronchiolitis management by 14.1% and de-implementing unnecessary and low-value management. Our stepped design followed the recently described Choosing Wisely De-implementation Framework.
A process evaluation was conducted alongside the cRCT. While RCTs are accepted as the gold standard for evaluating intervention effectiveness, RCT results alone do not provide information on what worked, how, and why. Process evaluation of complex interventions (those having multiple active strategies), as in our cRCT, is required to open the “black box” on what may or may not have worked and why. Evaluations undertaken alongside a trial can clarify the degree of implementation fidelity, how and why an intervention worked (or didn’t work), and how interventions could be improved for subsequent programmes. Process evaluations are particularly important in multi-site trials, where the same intervention is delivered yet received and utilised differently, and they assist the interpretation of outcome results. A systematic review of process evaluations found a weak evidence base in implementation studies, recommending mixed-methods, theory-guided designs with data collection both during and following implementation. Integrating process and outcome analysis allows evaluations to explore possible associations between implementation strategies, delivery, receipt, and effectiveness outcomes.
Process evaluations in acute paediatric settings, and in de-implementation, are rare. However, process evaluation is vital for improving broad dissemination of successful de-implementation interventions or identifying areas for improvement. This evaluation aims to provide insight into the interventions and their delivery, the experiences of clinical leads, and how these findings relate to the bronchiolitis cRCT results. Specifically, the objectives are:
To evaluate the degree to which the bronchiolitis interventions were delivered as planned (fidelity, dose and reach).
To explore clinical lead perceptions of the interventions, execution of these in a real clinical setting and acceptability (participant perspective).
To explore relationships between intervention fidelity and effectiveness data from our cRCT results, drawing lessons for future de-implementation projects.
A pre-specified mixed-methods process evaluation was conducted alongside the Paediatric Research in Emergency Departments International Collaborative (PREDICT) bronchiolitis cRCT; the protocol and results are published elsewhere [19, 24]. The evaluation was guided by a framework for process evaluation of cRCTs, the United Kingdom Medical Research Council (MRC) recommendations for complex interventions, and a systematic review of process evaluations in knowledge translation research. Additional file 1: Appendix Table 1 details the components and methods utilised in the process evaluation.
A quantitative component measured intervention fidelity (the degree to which the intervention was delivered as intended), dose and reach. A qualitative component examined clinical leads’ perceptions of the interventions via an online questionnaire. Qualitative and quantitative findings were viewed together when interpreting results.
Study setting and participants
A total of 26 hospitals (clusters) across Australia and New Zealand were recruited . Hospitals were randomised to intervention (n = 13) or control (n = 13). The emergency department (ED) and paediatric inpatient unit clinicians (nursing and medical) for each of the 13 intervention hospitals and clinical leads (nursing and medical in both departments) responsible for delivering the interventions were participants in the process evaluation. Intervention hospitals received targeted, theory-informed interventions. Control hospitals received an electronic and printed copy of the complete Australasian Bronchiolitis Guideline , representing usual practice for guideline dissemination at the time. Control hospitals received all interventions at the completion of the study. The implementation period was the Australian and New Zealand bronchiolitis season, 1st May 2017 to 30th November 2017.
Intervention hospitals received interventions targeting nursing and medical clinicians who managed infants with bronchiolitis in the ED and paediatric inpatient units (Table 1). Interventions were developed using a stepped theory-informed approach: 1) five key evidence-based recommendations were identified from the Australasian Bronchiolitis Guideline , 2) a qualitative study of clinicians in Australia and New Zealand identified factors perceived to influence the treatment of infants with bronchiolitis  using the TDF , 3) findings from this study were mapped to behaviour change techniques most likely to effect change for the identified factors [17, 18], and 4) targeted interventions were developed to operationalise these behaviour change techniques for the ED and paediatric inpatient units by considering the feasibility, local relevance, and acceptability of the intervention components. Additional file 1: Appendix Table 2 details how the bronchiolitis interventions were rolled out and mapped to the Template for Intervention Description and Replication (TIDieR) checklist .
Hospital baseline demographics were collected. Quantitative process evaluation data (intervention fidelity) was collected by clinical leads at intervention hospitals (ED and paediatric inpatient units clinical leads) regularly during the implementation period. Qualitative data was collected from clinical leads via an online questionnaire administered on study completion.
Data relating to intervention fidelity (the degree to which interventions were delivered as intended), dose and reach were collected (minimum monthly) by clinical leads via online entry into a training log (quantitative). Table 2 details the unique fidelity scoring system developed to measure fidelity, with each intervention having the same percentage weighting in the final score. Educational sessions were recorded, noting the number of clinicians attending, duration, frequency, who led the session, and modifications made to the educational PowerPoint presentation. Monthly audits (n = 7) of the first 20 bronchiolitis presentations (n = 10 discharged from ED; n = 10 discharged from paediatric inpatient unit) were completed by intervention hospitals. A report was provided with tabulated and graphical compliance by month for their hospital’s total compliance (for all five guideline recommendations), each of the five guideline recommendations, comparisons with previous audits and baseline data, and their hospital benchmarked anonymously against the top-performing intervention hospital. Audit and feedback cycle frequency, dissemination methods (written, verbal), frequency of audit report distribution, and action planning in light of audit results were recorded. Use of promotional and teaching materials was noted. Intervention hospitals were requested to appoint four clinical leads for the duration of the implementation period (one nursing and one medical clinical lead from each of ED and paediatric inpatient unit), with guidance provided on suitable clinical lead traits. Clinical leads attended the train-the-trainer day, led delivery of interventions and co-ordinated audit and feedback. The number of clinical leads attending the train-the-trainer day, and whether they remained for the duration of the study, were recorded. Adherence to completing training logs was assessed at least monthly, with regular reminders sent to clinical leads from the research support team.
Qualitative data from the clinical lead questionnaire was collected at study end, gaining feedback from clinical leads on interventions and their delivery. Integrating quantitative and qualitative data with effectiveness findings was undertaken to assist with analysis and interpretation . Additional file 1: Appendix Table 1 describes research questions for each process evaluation domain.
Recruitment and reach
Heterogeneity of clusters (intervention and control hospitals) was assessed quantitatively by comparing hospital type, annual bronchiolitis presentation numbers, staffing numbers and baseline compliance to the five key recommendations from the Australasian Bronchiolitis Guideline. Bronchiolitis intervention reach to clinicians was evaluated through information provided from training logs which clinical leads maintained over the implementation period.
Data on intervention delivery were analysed descriptively. Individual intervention hospital results are presented as percentage compliance for each intervention and total percentage compliance for all six interventions. Combined results across hospitals are presented as percentage compliance for each intervention (mean; standard deviation (SD)) and total percentage compliance for all six interventions (mean; SD). A scoring system was created by the research group to capture fidelity components for the six interventions, with each intervention equally weighted in the total mean fidelity score (Table 2).
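The equal-weighting calculation can be sketched as follows. This is a minimal, illustrative sketch only: the intervention labels and the example hospital's scores are hypothetical, and the components feeding each per-intervention percentage are those detailed in Table 2, not reproduced here.

```python
# Illustrative sketch of the equally weighted total fidelity score.
# Intervention names mirror the six interventions described in the text;
# the example scores below are invented for demonstration.

INTERVENTIONS = [
    "clinical_leads",
    "stakeholder_meeting",
    "train_the_trainer",
    "education_delivery",
    "other_educational_materials",
    "audit_and_feedback",
]

def total_fidelity(scores: dict) -> float:
    """Total fidelity = unweighted mean of the six per-intervention
    percentage scores, so each intervention contributes equally."""
    missing = set(INTERVENTIONS) - set(scores)
    if missing:
        raise ValueError(f"missing intervention scores: {missing}")
    return sum(scores[name] for name in INTERVENTIONS) / len(INTERVENTIONS)

# Hypothetical hospital: high clinical-lead fidelity, lower audit fidelity.
example = {
    "clinical_leads": 100.0,
    "stakeholder_meeting": 90.0,
    "train_the_trainer": 75.0,
    "education_delivery": 80.0,
    "other_educational_materials": 60.0,
    "audit_and_feedback": 65.0,
}
print(round(total_fidelity(example), 1))  # → 78.3
```

Because no intervention is weighted above another, a hospital weak on one intervention (e.g. audit and feedback) can still achieve a moderate-to-high total score through strength elsewhere.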
We plotted change in primary outcome compliance with all five bronchiolitis guideline recommendations (between 2014/2015 versus 2017 implementation year) for each individual intervention hospital (cluster level) and total mean fidelity score by hospital to assess a possible relationship.
Response to intervention
Clinical lead questionnaires were analysed qualitatively using thematic analysis to give insights into perceptions of the interventions, their acceptability, delivery, and receipt. Findings from these questionnaires are reported descriptively using quotations.
Recruitment of clusters and cRCT findings
The outcomes of the cRCT are reported in detail elsewhere . In summary, 26 hospitals (Australia = 20; New Zealand = 6) were randomised (intervention group = 13; control group = 13), including 7 tertiary paediatric hospitals (all in Australia). No hospitals withdrew following randomisation. Intervention and control hospitals were well balanced at baseline (Table 3).
The primary outcome was compliance with the Australasian Bronchiolitis Guideline during the first 24 h of care (acute care period), with no use of CXRs, salbutamol, antibiotics, glucocorticoids and adrenaline. Implementation year data was collected on 3727 infants. Compliance with the guideline recommendations was 85.1% (95%CI 82.6–89.7%) in intervention hospitals versus 73.0% (95%CI 65.3–78.8%) in control hospitals, with an adjusted risk difference of 14.1% (95% CI 6.5–21.7%, p < 0.001) favouring the intervention hospitals.
Implementation fidelity, dose and reach to clusters and individuals
All 13 intervention hospitals received the six bronchiolitis interventions as per study protocol with total intervention hospital fidelity scores ranging from 55 to 98% (mean 78%; SD 13%) (Table 4).
Intervention hospitals were requested to identify four clinical leads, one nursing and one medical lead in each of the ED and paediatric inpatient unit, for the study duration. Twelve (92%) of 13 hospitals achieved this; one hospital had three clinical leads for the duration (fidelity mean 98%; SD 7%). Key clinical lead tasks are detailed in Table 1. Of the 55 clinical leads, 42 (76%) responded to the qualitative questionnaire (questionnaire findings are reported descriptively using quotations in italics). Table 5 details the characteristics of clinical leads who responded.
Clinical leads reported positively on their role, with teamwork in both their department and inter-departmentally viewed as valuable.
Close working relationships between paediatric and ED members of the study team locally made rolling out a new clinical policy much easier. (Paediatric inpatient unit, medical).
I found the use of a clinical lead as "go to" person on the floor was useful and everyone quickly learnt who to go to from the education sessions that we ran with most of the medical and nursing staff. (ED, nursing).
Time constraints for education, and trying to influence some clinicians’ practice, were highlighted as challenges.
Educating in a busy department is challenging. (ED, nursing).
Changing some traditionalists mindset [what was challenging]. (ED, medical).
Research study group members met with clinical leads at each intervention hospital at the beginning of the study to discuss the Australasian Bronchiolitis Guideline, international and local variation in bronchiolitis management, review their hospital’s audit of bronchiolitis compliance (n = 40 infants with bronchiolitis), and discuss any anticipated local barriers, with the aim to gain hospital buy-in. Forty-seven (fidelity mean 90%, SD 13%) of the 52 clinical leads attended stakeholder meetings, with a meeting held at each hospital (100%).
All clinical leads were requested and funded to attend a workshop facilitated by research group members and credible experts in bronchiolitis and implementation science. Table 1 details the workshop content. The bronchiolitis expert role-modelled delivery of the educational PowerPoint intervention, emphasising important key points targeting the behaviours and beliefs we were seeking to influence. Forty-two (fidelity mean 81%; SD 23%) clinical leads attended the workshop, with every intervention hospital having at least one clinical lead attend. Feedback on the workshop was overwhelmingly positive.
Educational intervention delivery
A PowerPoint presentation was provided with messages addressing key findings from the qualitative study, using the behaviour change techniques most likely to effect change. Key messages needing to be conveyed to clinicians were highlighted. Clinical leads aimed to train 80% of clinicians with the PowerPoint presentation within the first month. Five (38.5%) hospitals achieved this, five (38.5%) trained > 50% of staff, and three (23%) trained < 50% of staff. Clinical leads mentioned many further informal educational sessions which were not quantified, e.g. bedside clinical teaching, at nursing huddles, and during one-on-one conversations. All hospitals continued educating clinicians for the duration of the implementation period. Positive feedback on the presentation was reported. Some clinical leads modified the presentation to enable shorter teaching sessions, keeping the key message slides. Elements of the presentation felt to be most beneficial were the slides detailing the evidence supporting the five evidence-based recommendations, and the key message slides (with red stickers emphasising importance).
The most powerful elements were the statistics on each recommendation and why it [therapies / management processes] makes no difference. (Paediatric inpatient unit, nursing).
The slides with ‘red stickers’ [five key slides detailing why not to use therapies / management processes] and adverse effects of interventions. (ED, medical).
Use of other educational materials
Additional materials included: a video demonstrating a clinician discussing bronchiolitis with a family; fact sheets (evidence behind no use of CXR, salbutamol and antibiotics); and promotional materials (posters) with use ranging from 40 to 90%. The clinician video received mixed reviews, with some leads finding it useful for junior staff, and others not using it due to time constraints.
Excellent for medical staff if junior, and not yet developed their own “spiel”. (ED, medical).
Fact sheets were particularly useful for senior medical clinicians whom clinical leads were struggling to influence. Despite these being designed for health professionals, some clinical leads found them helpful for parents.
These [evidence sheets] were more useful for senior medical staff that questioned the basis for the recommendations. (ED, medical).
Two posters (detailing guideline recommendations and over-use of therapies) were well utilised. Feedback was positive, with posters displayed in both clinical (for patients/families and clinicians) and non-clinical areas (for staff; displayed on educational boards, in staff tearoom and staff bathroom).
Posters, particularly the recommendations poster were placed in the doctor write-up area and in the paediatric resus area. Proved to be valuable and effective reminders of what not to do. (ED, medical).
They [posters] were very well received. They were pretty well visible from every cubicle. (ED, nursing).
Audit and feedback
Monthly audits (n = 7) of 20 bronchiolitis presentations were completed by each intervention hospital. Individualised hospital audit reports were produced detailing tabulated and graphical compliance results by month for use of chest x-ray, salbutamol, glucocorticoids, antibiotics and adrenaline, temporal trends, and anonymised benchmarking against the top performing hospital. Dissemination of this report in verbal and written format to clinicians was requested, with action planning and target setting encouraged. All hospitals completed seven audits (100%) with results disseminated back to staff in a variety of methods and frequency. Overall fidelity for audit and feedback ranged from 39 to 100% (fidelity mean 65%; SD 20%) (Table 4).
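The audit report logic described above can be sketched in miniature. This is a hypothetical illustration, assuming a simple per-month compliance percentage and anonymous benchmarking against the top performer; the hospital labels, audit records and function names are invented, and the real reports also broke results down by each of the five recommendations and by temporal trend.

```python
# Illustrative sketch of the monthly audit-and-feedback summary:
# per-hospital compliance plus anonymous benchmarking against the
# top-performing hospital. All data here are hypothetical.

def compliance(audits: list) -> float:
    """Percentage of audited presentations compliant with all five
    guideline recommendations (no CXR, salbutamol, antibiotics,
    glucocorticoids or adrenaline). Each entry is True if compliant."""
    return 100.0 * sum(audits) / len(audits)

def benchmark(monthly: dict) -> dict:
    """Return each hospital's compliance alongside the (anonymised)
    top-performing hospital's compliance for the same month."""
    scores = {hosp: compliance(audits) for hosp, audits in monthly.items()}
    top = max(scores.values())
    return {hosp: {"compliance": s, "top_hospital": top}
            for hosp, s in scores.items()}

# One hypothetical month: 20 presentations audited per hospital.
month = {
    "hospital_A": [True] * 17 + [False] * 3,   # 85% compliant
    "hospital_B": [True] * 14 + [False] * 6,   # 70% compliant
}
report = benchmark(month)
print(report["hospital_B"])  # → {'compliance': 70.0, 'top_hospital': 85.0}
```

Reporting only the top performer's score, without naming it, is what allowed the benchmarking described in the text to add competition while keeping hospital identities anonymous.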
Clinical leads were positive about the usefulness of audit and feedback, although time to complete audits was challenging. Variation in feedback dissemination strategies was helpful for clinicians, and discussion between departments at handovers or during education sessions was viewed positively.
Feedback was used to congratulate improvement, but also to suggest re-focus on our treatment after the initial implementation "excitement" faded. (ED, nursing).
Feedback was clear, concise, easily understood and distributed by email effectively. Informal discussion regarding individual cases occurred. Results in general were discussed at combined ED and Paediatrics educational sessions. (ED, medical).
It was a good reminder about keeping teaching up to date and allowed feedback to staff. It also allowed us to identify misses - a locum paediatric House Officer was not informed of our guidelines. (Paediatric inpatient unit, medical).
All 13 intervention hospitals utilised the six interventions with good adherence to the study protocol. This was achieved within clinical leads’ non-clinical time and existing local educational programmes. Figure 1 details bronchiolitis intervention fidelity and change in compliance with all five bronchiolitis guideline recommendations between 2014/2015 and 2017. No obvious relationship was demonstrated between fidelity (i.e. protocol adherence) and guideline compliance. The vast majority of hospitals (n = 12; 92%) improved compliance with the bronchiolitis guideline recommendations, with one hospital having reduced compliance despite achieving 80% intervention fidelity. The hospital with the second lowest fidelity score had the second highest compliance improvement, and the hospital with the highest fidelity score had the highest compliance improvement.
Clinical leads were positive regarding the success of the interventions in implementing the five bronchiolitis recommendations:
Absolutely. There has been a culture shift. It was challenging initially, but gradually changed the mindset. (ED, medical).
Implementing the five recommendations was straight forward and successful. (Paediatric inpatient unit, medical).
Utilising educators in both departments using the same information meant consistency and increased effectiveness. (Paediatric inpatient unit, nursing).
Time challenges in educating staff, and maintaining education to cover staff rotations, were raised. Medical clinicians were considered harder to influence than nursing clinicians, with some nurses being uncomfortable questioning medical decisions:
Physician dependent. Some physicians excellent at implementing recommendations; some not so. (ED, nursing).
Initially widespread acceptance but then old habits came creeping back in through older consultants. (ED, medical).
I feel the nurses are getting the information but still not comfortable asking the medical team why they are charting salbutamol. (Paediatric inpatient unit, nursing)
Minor modifications of the interventions were allowed, with the request that key messages remained. Some clinical leads modified the presentation, with 14 (33%) condensing slides to meet time constraints while leaving key messages unchanged.
Process evaluation of de-implementation is rare, yet it is vital if interventions from successful experimental evaluations are to be implemented into real-world clinical practice. This is particularly important where de-implementation is required, as this is believed to be harder than implementation, with calls for more rigorous de-implementation research to be undertaken. Our process evaluation of a cRCT which successfully utilised interventions to de-implement non-evidence-based bronchiolitis management in a real-world clinical setting addresses this deficit. We utilised recommendations from two process evaluation frameworks [20, 23] and a systematic review of process evaluations. This mixed-methods process evaluation found that our targeted, theory-informed interventions were delivered with moderate-to-high fidelity (55 to 98%) and were well received by clinical leads. The interventions reached target clinicians and were found to be acceptable. The main challenge was the time constraint of delivering interventions within the everyday demands of clinical practice.
The effectiveness of our bronchiolitis interventions has been robustly assessed via a multi-centre cRCT. In this trial of 26 hospitals, with data from 3727 infants, our interventions improved bronchiolitis management by 14.1% (95% CI 6.5–21.7%) in intervention hospitals compared to control hospitals, which undertook usual dissemination practices for the Australasian Bronchiolitis Guideline. This absolute change is at the upper end of improvements shown in cRCTs. Our process evaluation, along with our cRCT results, confirms our interventions’ effectiveness in improving the treatment of infants with bronchiolitis; importantly, the interventions were acceptable to the clinicians who rolled them out within existing clinical and non-clinical time, in a real-world clinical environment. These results show promise for widespread use of our interventions where the drivers of bronchiolitis management are similar.
Complex interventions, those having multiple interacting components, are increasingly used to tackle problems such as evidence-based management of common conditions like bronchiolitis. Process evaluation aims to open the “black box” of implementation studies to tease out why an intervention might or might not work. While a process evaluation cannot realistically examine all effects of a complex intervention, answering key questions well is more valuable than attempting to answer many questions with less certainty. Using a mixed-methods design is encouraged, as it helps to capture what happened and why, and using theory enables comparisons between studies. Our process evaluation methodology addresses these requests.
Our interventions were implemented and delivered as intended, with the 13 intervention hospitals utilising the interventions similarly in terms of content, dose and delivery. Importantly, the sequence of intervention delivery was the same in each hospital. Our scoring system ensured accurate and transparent fidelity measurement. The six interventions had equal weighting, without assuming one intervention was more effective than another (Table 2). Fidelity was likely higher than reported, as not all intervention use was captured. Data collection timing can be a methodological weakness in process evaluations, with recommendations for data collection before, during and after the study. A strength of our study was that quantitative fidelity data was collected regularly throughout the study, although some clinical leads required more frequent reminders than a monthly email to complete the training log. Qualitative data was collected immediately after the implementation period, minimising recall bias and reducing the challenges of retrospective data collection. The mixed-methods evaluation adds richness to our findings, with interpretation enhanced by the triangulation of qualitative and quantitative data with intervention effectiveness data.
Total fidelity scores varied between intervention hospitals. All scored > 55%, and two-thirds scored over 80%, affirming positive reach and receipt of the interventions. This represents moderate-to-high fidelity, with others suggesting that 80 to 100% intervention adherence represents ‘high’ fidelity, 51 to 79% ‘moderate’ fidelity, and < 50% ‘low’ fidelity [28, 29]. The lack of a clear interaction between intervention fidelity and compliance needs cautious interpretation due to the small number of hospitals (Fig. 1). One hospital had a small reduction in overall compliance despite an 80% fidelity score, while the hospital with the second highest increase in compliance had the second lowest fidelity score. All hospitals embraced the interventions positively, achieving > 55% fidelity, positive qualitative findings from clinical leads, and a positive cRCT result. Despite no clear interaction between fidelity and improved compliance, we believe a fidelity threshold effect possibly exists. Our positive cRCT results and moderate-to-high fidelity in a real-world setting suggest that, when our interventions are used with intensity, improvement in compliance with bronchiolitis management can be expected.
Intervention hospitals aimed to train 80% of clinicians within the first month, as educating at the end of the bronchiolitis season would have minimal impact. Increased autumn workloads in acute paediatrics and the number of staff in large departments, particularly EDs, likely influenced the lower initial education rate. Only five (38%) hospitals achieved the target, although a further five (38%) hospitals trained > 50% of clinicians, and training continued over the duration of the bronchiolitis season. This suggests that a lower education target with continued education is effective. Future recommendations to increase education include starting bronchiolitis education earlier in the season, mandating education, and utilising online education tools.
All intervention hospitals completed seven audit and feedback cycles, with differences in the dissemination of results. The provision of real-time data was viewed favourably as a means to monitor progress and redirect education. Anonymously benchmarking individual hospitals against the top-performing hospital added competition, with some hospitals striving to be the top performer. The literature indicates that showing clinicians performance data has improved guideline compliance in other medical conditions [30, 31], with systematic reviews of audit and feedback suggesting small but potentially important improvements (RD 4.3%; IQR 0.5 to 16.0%).
The clinical leads’ role was clearly defined, with their importance reinforced during the train-the-trainer day. Clinical lead fidelity was high across all hospitals (mean = 98%; SD = 6.9%). The use of clinical leads, either alone or in combination with other interventions, has shown effectiveness in meta-analysis (n = 24 RCTs), with a median absolute improvement in care of 10.8% (IQR 3.5 to 14.6%). Successful leads have been identified as having key attributes: influence, ownership, physical presence, grit, persuasiveness, and participative leadership. Although we did not formally assess clinical lead attributes, we believe many demonstrated these throughout the study.
All clinical leads were invited to complete a questionnaire at study end, giving feedback on the interventions. A similar process evaluation of a complex intervention involving several departments in multiple hospitals had the majority of questionnaires completed by one clinician on behalf of each hospital, potentially capturing only a single viewpoint. Inviting each of our clinical leads to complete the questionnaire ensured all views had the opportunity to be heard, and a 76% response rate allowed adequate reflection of the views of most clinical leads. Questionnaire findings were overwhelmingly positive regarding the interventions. Suggestions to shorten the clinician video and PowerPoint presentation are helpful and realistic. Clinical leads were pragmatic, reducing presentation content while ensuring important key messages remained, and using the video judiciously. While we requested hospitals utilise all interventions, we accepted the impact of busy autumn and winter months on intervention delivery, and appreciated honesty in describing intervention modifications. Promotional posters received overwhelmingly positive feedback. A systematic review of printed educational materials used alone or compared to no intervention demonstrated a small effect, with their effectiveness as part of a multifaceted intervention being uncertain. Our positive feedback and positive cRCT result suggest that posters, being easy and low-cost to produce, be considered as part of future intervention packages.
A strength of our study is that no hospitals withdrew post randomisation, suggesting clinicians’ and hospitals’ commitment to reducing low-value care when treating infants with bronchiolitis, and that the interventions were appropriate and realistic for the real-world setting of de-implementation in acute paediatrics. We also acknowledge study limitations. Clinical lead feedback was positive, but there may be response bias, as the perspectives of the leads who did not respond are unknown. We did not obtain feedback from clinicians who received study interventions from clinical leads; however, all clinical leads were also practicing clinicians, and we surmised their feedback would be similar to that of their clinician colleagues.
It is tempting to identify one intervention as superior to the others, but this is not possible as the six interventions were delivered as a package and were not independent. However, opening the ‘black box’ of our de-implementation study has given insight into what worked, why, and the potential barriers to implementation. These findings are important as we scale up dissemination of our interventions to improve the treatment of infants with bronchiolitis, with modifications addressing the time challenges of education delivery. A co-ordinated approach, utilising national and international networks to disseminate our findings and interventions, is required to optimise translation.
This process evaluation found our bronchiolitis interventions were delivered as intended within the ED and paediatric context, were received positively, and achieved good reach. All six interventions were undertaken, resulting in improvement in the treatment of infants with bronchiolitis. Time constraints posed challenges for intervention delivery; however, clinical leads adapted to ensure key intervention components were delivered. These results provide guidance to researchers and clinicians utilising our interventions in ED and paediatric inpatient settings, and have wider implications for de-implementation science in general.
References
Florin TA, Plint AC, Zorc JJ. Viral bronchiolitis. Lancet. 2017;389(10065):211–24.
Ralston SL, Lieberthal AS, Meissner HC, Alverson BK, Baley JE, Gadomski AM, et al. Clinical practice guideline: the diagnosis, management, and prevention of bronchiolitis. Pediatrics. 2014;134(5):e1474–502.
Ricci V, Delgado Nunes V, Murphy MS, Cunningham S, Guideline Development Group and Technical Team. Bronchiolitis in children: summary of NICE guidance. BMJ. 2015;350:h2305.
Scottish Intercollegiate Guidelines Network. Bronchiolitis in children: a national care guideline. Edinburgh: Scottish Intercollegiate Guidelines Network; 2006.
O'Brien S, Borland ML, Cotterell E, Armstrong D, Babl FE, Bauert P, et al. Australasian bronchiolitis Guideline. J Paediatr Child Health. 2018;55(1):42–53.
Oakley E, Brys T, Borland M, Neutze J, Phillips N, Krieser D, et al. Medication use in infants admitted with bronchiolitis. Emerg Med Australas. 2018;30(3):389–97.
Schuh S, Babl FE, Dalziel SR, Freedman SB, Macias CG, Stephens D, et al. Practice variation in acute bronchiolitis: a pediatric emergency research networks study. Pediatrics. 2017;140(6):e20170842.
Eccles M, Mittman B. Welcome to Implementation Science. Implement Sci. 2006;1:1.
Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients' care. Lancet. 2003;362(9391):1225–30.
Baker R, Camosso‐Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, Robertson N, Wensing M, Fiander M, Eccles MP, Godycki‐Cwirko M, van Lieshout J, Jäger C. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev. 2015;(4):CD005470. https://doi.org/10.1002/14651858.CD005470.pub3.
Michie S, Johnston M, Francis J, Hardeman W, Eccles M. From theory to intervention: mapping theoretically derived Behavioural determinants to behaviour change techniques. Appl Psychol. 2008;57(4):660–80.
Grimshaw JM, Patey AM, Kirkham KR, Hall A, Dowling SK, Rodondi N, et al. De-implementing wisely: developing the evidence base to reduce low-value care. BMJ Qual Saf. 2020;29(5):409–17.
Cassell C, Guest J. Choosing wisely: helping physicians and patients make smart decisions about their care. JAMA. 2012;307(17):1801–2.
Haskell L, Tavender EJ, Wilson C, Babl F, Sheridan N, Oakley E, et al. Understanding factors that contribute to variations in bronchiolitis management in acute care settings: a qualitative study in Australia and New Zealand using the theoretical domains framework. BMC Pediatr. 2020;20(1):189.
Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A, et al. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14(1):26–33.
Cane J, O'Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7(1):37.
Cane J, Richardson M, Johnston M, Ladha R, Michie S. From lists of behaviour change techniques (BCTs) to structured hierarchies: comparison of two methods of developing a hierarchy of BCTs. Br J Health Psychol. 2015;20(1):130–50.
Atkins L, Francis J, Islam R, O'Connor D, Patey A, Ivers N, et al. A guide to using the theoretical domains framework of behaviour change to investigate implementation problems. Implement Sci. 2017;12(1):77.
Haskell L, Tavender EJ, Wilson CL, O'Brien S, Babl FE, Borland ML, et al. Effectiveness of targeted interventions on treatment of infants with bronchiolitis: a randomized clinical trial. JAMA Pediatr. 2021;175(8):797-806.
Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. 2015;350:h1258.
Oakley A, Strange V, Bonell C, Allen E, Stephenson J, RIPPLE Study Team. Process evaluation in randomised controlled trials of complex interventions. BMJ. 2006;332(7538):413–6.
Scott SD, Rotter T, Flynn R, Brooks H, Plesuk T, Bannar-Martin K, et al. Systematic review of the use of process evaluations in knowledge translation research. Syst Rev. 2019;8:266. https://doi.org/10.1186/s13643-019-1161-y.
Grant A, Treweek S, Dreischulte T, Foy R, Guthrie B. Process evaluations for cluster-randomised trials of complex interventions: a proposed framework for design and reporting. Trials. 2013;14:15.
Haskell L, Tavender EJ, Wilson C, O'Brien S, Babl FE, Borland ML, et al. Implementing evidence-based practices in the care of infants with bronchiolitis in Australasian acute care settings: study protocol for a cluster randomised controlled trial. BMC Pediatr. 2018;18(1):218.
Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.
Grimshaw JM, Thomas RE, MacLennan G, Fraser C, Ramsay CR, Vale L, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004;8(6):iii–v 1-72.
Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M, et al. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.
Borrelli B. The assessment, monitoring, and enhancement of treatment fidelity in public health clinical trials. J Public Health Dent. 2011;71 Suppl 1:S52–63.
Noell G, Gresham F, Gansle K. Does treatment integrity matter? A preliminary investigation of instructional implementation and mathematics performance. J Behav Educ. 2002;11(1):51–67.
Tyler A, Krack P, Bakel LA, O'Hara K, Scudamore D, Topoz I, et al. Interventions to reduce over-utilized tests and treatments in bronchiolitis. Pediatrics. 2018;141(6).
Meeker D, Linder JA, Fox CR, et al. Effect of behavioral interventions on inappropriate antibiotic prescribing among primary care practices: a randomized clinical trial. JAMA. 2016;315(6):562–70.
Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;(6):CD000259.
Flodgren G, O'Brien MA, Parmelli E, Grimshaw JM. Local opinion leaders: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2019;6:CD000125.
Bonawitz K, Wetmore M, Heisler M, Dalton V, Damschroder LJ, Forman J, et al. Champions in context: which attributes matter for change efforts in healthcare? Implement Sci. 2020;15:62.
Stephens TJ, Peden CJ, Pearse RM, Shaw SE, Abbott TEF, Jones E, et al. Improving care at scale: process evaluation of a multi-component quality improvement intervention to reduce mortality after emergency abdominal surgery (EPOCH trial). Implement Sci. 2018;13:142.
Huis A, Holleman G, van Achterberg T, Grol R, Schoonhoven L, Hulscher M. Explaining the effects of two different strategies for promoting hand hygiene in hospital nurses: a process evaluation alongside a cluster randomised controlled trial. Implement Sci. 2013;8:41.
Wittmeier KD, Klassen TP, Sibley KM. Implementation science in pediatric health care: advances and opportunities. JAMA Pediatr. 2015;169(4):307–9.
This study was supported by a National Health and Medical Research Council (NHMRC) Centre of Research Excellence grant for Paediatric Emergency Medicine (GNT1058560), Australia; the Victorian Government’s Operational Infrastructure Support Programme, Australia; and the Health Research Council (HRC) of New Zealand (13/556), New Zealand. LH’s time was partially funded by a Clinical Research Training Fellowship from the HRC (19/140), New Zealand. SRD’s time was partially supported by Cure Kids New Zealand. FEB’s time was partially funded by a grant from the Royal Children’s Hospital Foundation, Melbourne, Australia, and an NHMRC Practitioner Fellowship.
We would like to thank all the hospitals involved in this study: Australia: Austin Hospital, Victoria; Ballarat Base Hospital, Victoria; Caboolture Hospital, Queensland; Cairns Hospital, Queensland; Casey Hospital, Victoria; Gold Coast University Hospital, Queensland; Ipswich Hospital, Queensland; Logan Hospital, Queensland; Lyell McEwin Hospital, South Australia; Queensland Children’s Hospital, Queensland; Royal Darwin Hospital, Northern Territory; Royal North Shore Hospital, New South Wales; Sydney Children’s Hospital, New South Wales; The Children’s Hospital at Westmead, New South Wales; The Northern Hospital, Victoria; The Prince Charles Hospital, Queensland; Toowoomba Base Hospital, Queensland; Townsville Hospital, Queensland; Wollongong Hospital, New South Wales; Women’s and Children’s Hospital, South Australia. New Zealand: Christchurch Hospital, Christchurch; Hawkes Bay Hospital, Hastings; Rotorua Hospital, Rotorua; Southland Hospital, Invercargill; Tauranga Hospital, Tauranga; Waikato Hospital, Hamilton. We would like to acknowledge Xiaofang Wang, from the Murdoch Children’s Research Institute (MCRI), Melbourne, Australia, for assistance with statistical analysis.
Declaration of Helsinki
This study has been conducted in accordance with the principles of the Declaration of Helsinki.
Supported by an NHMRC Centre of Research Excellence grant for Paediatric Emergency Medicine (GNT1058560), Australia, and the Health Research Council of New Zealand (HRC 13/556).
Ethics approval and consent to participate
The study was approved by the Royal Children’s Hospital Human Research Ethics Committee (EC00238), Australia (HREC/16/RCHM/84), and the Northern A Health and Disability Ethics Committee, New Zealand (16/NTA/146). The ethics committees waived the requirement for informed patient consent because de-identified data from medical records were being collected.
Competing interests
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix Table 1 Components and methods of process evaluation. Appendix Table 2 Bronchiolitis intervention detail based on Template for Intervention Description and Replication (TIDieR). Appendix Table 3 Clinical lead questionnaire.
Cite this article
Haskell, L., Tavender, E.J., O’Brien, S. et al. Process evaluation of a cluster randomised controlled trial to improve bronchiolitis management – a PREDICT mixed-methods study. BMC Health Serv Res 21, 1282 (2021). https://doi.org/10.1186/s12913-021-07279-2