The Value Equation: Three complementary propositions for reconciling fidelity and adaptation in evidence-based practice implementation

Abstract

Background

There has long been debate about the balance between fidelity to evidence-based interventions (EBIs) and the need for adaptation for specific contexts or particular patients. The debate is relevant to virtually all clinical areas. This paper synthesises arguments from both fidelity and adaptation perspectives to provide a comprehensive understanding of the challenges involved, and proposes a theoretical and practical approach for how fidelity and adaptation can optimally be managed.

Discussion

There are convincing arguments in support of both fidelity and adaptations, representing the perspectives of intervention developers and internal validity on the one hand and users and external validity on the other. Instead of characterizing fidelity and adaptation as mutually exclusive, we propose that they may better be conceptualized as complementary, representing two synergistic perspectives that can increase the relevance of research, and provide a practical way to approach the goal of optimizing patient outcomes. The theoretical approach proposed, the “Value Equation,” provides a method for reconciling the fidelity and adaptation debate by putting it in relation to the value (V) that is produced. The equation involves three terms: intervention (IN), context (C), and implementation strategies (IS). Fidelity and adaptation determine how these terms are balanced and, in turn, the end product: the value it produces for patients, providers, organizations, and systems. The Value Equation summarizes three central propositions: 1) implementation strategies can be construed as a method to create fit between EBIs and context, 2) the end product of implementation efforts should emphasize overall value rather than only the intervention effects, and 3) transparency is vital, not only for the intervention but for all four terms of the equation.

Summary

There are merits to arguments for both fidelity and adaptation. We propose a theoretical approach, a Value Equation, to reconciling the fidelity and adaptation debate. Although there are complexities in the equation and the propositions, we suggest that the Value Equation be used in developing and testing hypotheses that can help implementation science move toward a more granular understanding of the roles of fidelity and adaptation in the implementation process, and ultimately sustainability of practices that provide value to stakeholders.

Background

Implementation science is defined as the scientific study of methods to promote the uptake of research findings into routine healthcare in clinical, organizational, or policy contexts [1]. The goal is to close the gap between what has been shown to be effective in rigorous trials (i.e., evidence-based interventions [EBIs], such as diagnostic tools and treatments) and what is done in clinical practice, so that patient and population health is improved.

If patients and ultimately populations are going to benefit from the best available evidence, fidelity (also denoted as adherence or treatment integrity), defined as the degree to which an intervention is carried out as it was described and originally tested and/or as the developer intended [2,3,4,5,6,7,8], is important in all steps of the research-to-practice pathway. During efficacy and effectiveness trials, when the main purpose is to separate effects of the intervention from other factors, high fidelity ensures that it is the intervention, not other factors, that produces the effect. In implementation studies, EBI fidelity is often a primary implementation outcome for determining if the methods used to promote uptake were successful [9]. In routine care, the degree to which EBIs are delivered as originally designed ultimately determines if patients and populations indeed receive the interventions that correspond with the best available evidence [2, 5, 10]. Overall, this makes understanding fidelity a central issue for implementation science.

However, throughout all the steps of the research-to-practice pathway, there are forces that pull away from fidelity. This has drawn increased attention to the role of adaptations, defined as the changes made to an intervention based on deliberate considerations to increase fit with patient or contextual factors at the system, organization, team, and individual clinician level [11,12,13,14,15]. This deliberateness distinguishes adaptations from drift [16]. The central role of adaptations in implementation is increasingly being acknowledged, as evident from recent special issues [17] and themes in recent scientific meetings such as the US National Institutes of Health and AcademyHealth 9th Annual Conference on the Science of Dissemination and Implementation [10] and the 2018 Nordic Implementation Conference [18]. Yet, although there is global concern about fidelity-adaptation questions, the related conceptual and methodological issues are far from resolved.

Starting from routine care, there is ample evidence illustrating how adaptation of EBIs is the rule rather than the exception when used in real-world practice [12, 19, 20]. Factors at multiple levels, from the system, organization, provider, and patient, can all influence the degree to which EBIs might require adaptation [14, 21]. For example, reasons for adaptation can include system- and organization-level concerns such as workforce readiness, organizational context, and the cost of purchasing EBIs from intervention developers and purveyors. In line with this, it has been noted that adaptability is an important factor that implementation strategies should address and that adaptation is likely to be needed to promote uptake [22]. This follows Everett Rogers’ seminal research stipulating that an innovation (e.g., an EBI) will almost always be reshaped to fit the organization or context in which it will be used [23]. In a concurrent development, research about cultural adaptations has also highlighted the need to tailor interventions based on the culture of the target populations [24], as well as the need to increase our understanding of cultural influences on implementation strategies and outcomes [25].

Recently, there has also been more discussion about the role that adaptations play earlier along the research-to-practice pathway [26, 27]. This includes questioning the assumption that fidelity automatically maximizes effectiveness [28]. It also includes showing that adaptation happens not only when EBIs are used in practice but also during trials, indicating that intervention researchers also need to attend to issues related to fidelity and adaptation [27]. There have been a number of efforts to improve the reporting of fidelity and adaptation (e.g., [12, 13, 29,30,31]) (see also Roscoe et al. (2019) [32] for a comparison of four classification taxonomies), but to date, neither adaptations nor the intervention as planned (i.e., fidelity) is sufficiently described or documented in effectiveness trials [33,34,35,36]. This leaves a gap in understanding the full scope of the fidelity and adaptation dilemma earlier in the research-to-practice pathway. Thus, fidelity and adaptation are concepts that implementation science by necessity needs to acknowledge and address. Yet, this is prevented by the plethora of terms used, and the lack of clarity as to how the constructs can be conceptually organized. In Table 1, we propose a taxonomy that further refines the definitions of fidelity and adaptation, according to subcomponents and dimensions to which fidelity and adaptation can refer. This may aid in identifying relevant constructs for assessment and measurement.

Table 1 Definitions of subcomponents that represent dimensions suggested in the literature that fidelity and adaptation can refer to

The issue of fidelity and adaptation has been controversial for decades (e.g., [45]). This debate deals with the longstanding tension between achieving internal and external validity [46]. Whereas some scholars emphasize the importance of drawing valid conclusions about the effects of an intervention, thereby prioritizing internal validity, others highlight the need for interventions to fit and function in the daily operations of different systems and organizations, thus highlighting the virtue of external validity. However, there has been little theoretical development that can guide how fidelity and adaptations should be managed and documented across the research-to-practice pathway and how this relates to implementation science [21]. The debate and the research on fidelity and adaptations have been split and fragmented across multiple journals representing various clinical fields, disciplines, and/or settings in which EBIs are implemented. With some notable exceptions (e.g., [13, 28]), the debate has taken place in parallel silos, and there is currently a lack of an overview of the main arguments for fidelity on one side and adaptations on the other. This hampers the more comprehensive understanding needed to move toward a theoretical approach for how the fidelity and adaptation debate can be reconciled. This paper aims to synthesise the main arguments for fidelity and adaptation and, based on that, propose a theoretical and applied approach for how adaptation and fidelity can optimally be managed.

Five reasons fidelity is vital – and five reasons adaptations are also vital

As outlined above, the logic of the research-to-practice pathway stipulates that EBIs should be used as they were described and intended to be provided. This approach implies that fidelity to the intervention is central and any deviations problematic. However, there are also strong arguments for why adaptations are needed. Table 2 summarizes the more pervasive reasons and justifications found in the literature for fidelity and adaptations, respectively.

Table 2 Arguments for Fidelity and Adaptation

Discussion

There are valid and reasonable arguments in support of fidelity, and there are valid and reasonable arguments in support of adaptation. However, many of the arguments seem to be contradictory and sometimes mutually exclusive. Much of the debate over the years has taken one or the other position, but the possibility that adaptation and fidelity can coexist has also been raised. These suggestions note that they can co-exist as long as the EBI core components are adhered to (e.g., [11, 24, 30]), and, more recently, that adaptation can improve fidelity by ensuring adherence to the key principles or elements underlying the EBI [64, 65]. Recent advancements in the conceptualization, measurement and documentation of adaptations have moved the field forward by aiding the empirical exploration of the relationship between fidelity, adaptations and outcomes (e.g., [14, 29, 31]).

Yet, with some notable exceptions (e.g., Chambers and Norton’s “Adaptome” model [26]), there have been few attempts at making theoretical propositions that address how fidelity and adaptation can be reconciled. In the following, we deconstruct the arguments for fidelity and adaptation to get at underlying assumptions, and then make three propositions that reconcile fidelity and adaptation. The propositions and equation terms are summarized in the Value Equation, as shown in Table 3. The Value Equation states that the optimal value (V) is a product of the intervention (IN), the nature of the context (C) in which the intervention is being implemented, and the implementation strategies (IS). The Value Equation (V = IN * C * IS) terms are described in detail below.

Table 3 The Value Equation: V = IN * C * IS

Building the value equation

Table 3 summarizes elements of the Value Equation. Written as a simple mathematical equation, its starting point is an assumption that it is (only) the EBI that produces the effect:

$$ Intervention\ (IN)= Effect\ (E) $$

Implicit here is that by adhering to the intervention as it was designed, 1) the effect is maximized; 2) it is clear what is being delivered; 3) there is little unwanted EBI variation between organizations, professionals, and patients; and 4) it is possible to accumulate knowledge across studies. Nevertheless, as described previously, adaptation happens. Thus, there is a need to specify the intervention term as the extent to which the intervention was carried out as it was described (fidelity, INf), together with the fidelity-consistent (INfc) and fidelity-inconsistent (INfi) adaptations made [39] (see Table 3).
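
To make this decomposition concrete, the intervention term can be represented as a minimal sketch. The component names and the simple proportion-based fidelity score below are invented for illustration; the paper itself prescribes no computational model.

```python
from dataclasses import dataclass, field

@dataclass
class InterventionTerm:
    """Hypothetical record of the intervention term IN, split into fidelity
    (INf) and fidelity-consistent (INfc) / fidelity-inconsistent (INfi)
    adaptations. Component names are invented for illustration."""
    delivered_as_described: set   # components delivered as designed (INf)
    fidelity_consistent: set = field(default_factory=set)    # INfc adaptations
    fidelity_inconsistent: set = field(default_factory=set)  # INfi adaptations

    def fidelity(self, planned: set) -> float:
        """Proportion of planned components delivered as described."""
        return len(self.delivered_as_described & planned) / len(planned)

planned = {"psychoeducation", "exposure", "homework", "relapse_prevention"}
record = InterventionTerm(
    delivered_as_described={"psychoeducation", "exposure", "homework"},
    fidelity_consistent={"translated_materials"},          # core intact
    fidelity_inconsistent={"dropped_relapse_prevention"},  # core component lost
)
print(record.fidelity(planned))  # 0.75
```

Documenting all three parts, rather than a single fidelity score, is what later makes the adaptations assessable by others.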

As the EBI moves along the research-to-practice pathway, the influence of contextual factors is increasingly recognizable. Thus, a second term is added to the equation: Context (C).

$$ IN\ast C=E $$

Because many implementations take place in complex systems including influences on system, organization, provider, and patient levels, context needs to be further specified. Thus, we suggest that context be delineated as system context (Cs), organizational context (Co), provider context (e.g., professional discipline, training, attitudes toward the intervention) (Cpr), and patient context (e.g., target group) (Cpt).

The Value Equation proposes that by acknowledging that context is indeed a term in the equation, the effects of an intervention must, by necessity, be understood in relation to the context in which it is implemented. For example, even in efficacy trials, there are contextual factors that will influence the outcome (e.g., highly trained staff delivering the intervention, urban settings). Thus, an EBI is not effective in isolation; it is more or less effective for a certain group, in certain settings, and under certain conditions. When the EBI is used beyond that, the context term changes, and so does the expected effect. High fidelity may increase effects in certain contexts, and adaptation in others. The optimal answers lie in the configuration of both terms in the equation.

Implementation strategies create intervention–context fit

Implementation strategies are systematic processes to adopt and integrate EBIs into clinical care [22]. Implementation strategies can be simple (e.g., clinical reminders) or complex and multicomponent (e.g., training + coaching + audit and feedback), and they vary with EBIs and contexts. We build on this notion to derive our first proposition: that implementation strategies are ways to create fit (i.e., appropriateness [9]) between an intervention and a specific context. We add a third term to the equation: Implementation Strategy (IS).

$$ IN\ast C\ast IS=E $$

We argue that implementation strategies can optimize the effect of interventions in two ways: 1) by optimizing the outer system or inner organizational context so that it fits the intervention (ISc) [44], or 2) by optimizing the intervention so that it fits the context (ISi) (Table 3). Thus, in the first case, implementation strategies are concerned with increasing fidelity by enabling appropriate changes in the context (e.g., by increasing competence among staff and/or creating opportunities for the target behaviours through environmental restructuring, such as changing the reimbursement system to allow clinicians the needed time [66, 67]). In the second case, the implementation strategies promote adaptations to achieve fit (e.g., removing components because they are perceived as culturally inappropriate, or tailoring based on patient preferences [12, 68]).
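
The two directions of fit can be sketched as a simple classification. The strategy names and their ISc/ISi labels below are invented examples in the spirit of the paper, not an established taxonomy.

```python
# Sketch (invented strategy names and labels) of classifying implementation
# strategies by whether they create fit by changing the context (ISc)
# or by changing the intervention (ISi).
STRATEGY_DIRECTION = {
    "staff_training":           "ISc",  # raise competence so the EBI fits as designed
    "revise_reimbursement":     "ISc",  # environmental restructuring frees clinician time
    "remove_cultural_mismatch": "ISi",  # adapt the EBI to the cultural context
    "tailor_to_preferences":    "ISi",  # adapt the EBI to patient preferences
}

def fit_directions(chosen):
    """Count how many chosen strategies target the context vs. the intervention."""
    counts = {"ISc": 0, "ISi": 0}
    for name in chosen:
        counts[STRATEGY_DIRECTION[name]] += 1
    return counts

print(fit_directions(["staff_training", "tailor_to_preferences"]))
# {'ISc': 1, 'ISi': 1}
```

Making this direction explicit for every chosen strategy is one way to document why a given adaptation, or a given push for fidelity, was made.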

This proposition builds on the first argument for adaptation, stating that intervention–context fit is a necessary condition for implementation, but also invokes Elliott and Mihalic’s (2004) [69] notion that the need for intervention–context fit does not necessarily mean adaptation of the intervention; it may as well mean adaptation of the context to facilitate fidelity to the intervention. Thus, we build on previous work suggesting that adaptation and fidelity can co-exist (e.g., [11, 24, 30]), and add to that by explicitly proposing implementation strategies as the activities that optimize fit and reconcile fidelity and adaptation, whether those are concerned with modifying the intervention or the context, or both intervention and context.

The proposition to view implementation strategies as ways to create fit between an intervention and a specific context opens up innovative new approaches to choosing and matching implementation strategies, which has proven to be challenging [70]. The proposition aligns with recent suggestions to use user-centred design principles and community-academic partnerships for the purpose of creating fit between interventions and context, by engaging intervention developers and/or implementers and practitioners in a collaborative redesign process [71,72,73]. The Value Equation can aid this process by explicating which strategies are used, why (whether the purpose is to achieve fit by changing the context or by changing the intervention), and to what effect.

Moving from effect to multilevel value proposition

A compelling argument for both fidelity and adaptation is the potential to increase the effectiveness and public health impact of an intervention. Here, we make our second proposition: an intentional shift from focusing on the effect of an intervention to focusing on the value (V) it creates, making a final adjustment to the equation by exchanging effect for value. Expressed mathematically, the complete Value Equation becomes the following:

$$ IN\ast C\ast IS=V $$

Value is broader than intervention effects alone. It reflects the optimization of a configuration of patient (Vpt), provider (Vpr), organization (Vo), and system (Vs) values and outcomes. Thus, value is a multicomponent, multilevel construct that represents the perceived or real benefit for each stakeholder and for stakeholders combined: a multilevel value proposition. For example, value for a service system may be increased population health, while for an organization, it may be optimized service delivery and decreased costs. Concurrently, a clinical professional may view value as being able to consider individual patient needs and outcomes, and patients may value their own improved functioning in daily life and/or clinical outcomes.

But what, then, is success of an EBI? By focusing on value, we suggest that implementation success can be defined as the ability to optimize value across the different levels and stakeholders. This perspective on implementation success aligns with recent definitions of sustainability, which highlight the ability to continuously deliver benefits as a key part of the construct [74], with adaptations being a strategy to promote it [75]. The Value Equation proposes that effects on certain clinical outcomes are necessary but not sufficient. An EBI also needs to maximize value for individual providers, for the organization, and for the system. This shifts the focus of implementation from getting an EBI in place to thinking about its value more broadly, and to being more egalitarian in considering the needs of multiple stakeholders, including recently identified “bridging factors” to optimize implementation across context levels [76].
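
One hedged way to read the multiplicative form is that a weak score on any term pulls overall value toward zero, while the resulting value is weighted differently per stakeholder level (Vs, Vo, Vpr, Vpt). The scores and weights in this sketch are entirely invented; the paper does not quantify the equation.

```python
def value_configuration(intervention, context, strategy, weights):
    """Toy reading of V = IN * C * IS: a multiplicative core score in [0, 1],
    weighted per stakeholder level (Vs, Vo, Vpr, Vpt). All numbers invented."""
    core = intervention * context * strategy  # a weak term pulls V toward zero
    return {level: round(core * w, 3) for level, w in weights.items()}

# Invented scenario: strong intervention, moderate context fit, good strategies
v = value_configuration(0.9, 0.6, 0.8,
                        weights={"Vs": 1.0, "Vo": 0.9, "Vpr": 0.8, "Vpt": 1.0})
print(v)  # {'Vs': 0.432, 'Vo': 0.389, 'Vpr': 0.346, 'Vpt': 0.432}
```

The multiplicative design choice encodes the paper's point that a highly efficacious intervention (high IN) still yields little value when context fit or implementation strategies are weak.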

The equation, with its focus on value, also has implications for intervention developers. It implies moving from designing interventions that maximize efficacy to designing interventions that maximize value for multiple stakeholders. According to the Value Equation, the intervention that is most efficacious may not be the one that also provides the most value. A less complex intervention that can be delivered by less skilled staff and that requires fewer implementation resources (e.g., supervision, re-organization of care) may result in higher value than an intervention that stands little chance of being used in practice [26]. This is consistent with approaches to maximizing public health impact: a given EBI may have a smaller effect size, but if it is sustained and reaches more patients, then even a small effect size can have significant public health impact [77].

It is in relation to the multidimensional value configuration that fidelity and adaptation should be considered. Sometimes fidelity is a way to optimize value, sometimes adaptations are, and often it is a combination. This also means that fidelity might optimize one outcome and adaptation another. Furthermore, the different types of outcomes may be valued differently by different stakeholders. In this, we acknowledge that different stakeholders’ definitions of value may differ. In fact, they may often be misaligned, such as when an organization is required by the system to provide a service to a sub-population that does not request it. We suggest that the better implementers are at acknowledging and addressing these value conflicts, the higher the likelihood of successful and sustained implementation. Community–academic partnerships may be one bridging factor that can facilitate this process [78] by engaging stakeholders in jointly considering system, organization, and patient needs, increasing their understanding of others’ agendas, and encouraging a transparent negotiation of how to best address different needs. Techniques such as concept mapping and co-created program logic (COP) may be useful to promote an understanding of divergent viewpoints and an effective dialogue [79, 80].

Similarly, by moving from focus only on treatment effect to a value configuration, we can reconcile arguments for fidelity and adaptation in relation to equity. We simply propose focusing on equity of the value achieved by the equation as a whole (i.e., for all stakeholders across levels) rather than only equity in relation to the intervention.

Transparency over all the value equation terms

One of the main arguments for fidelity is related to transparency: Fidelity to an EBI is needed for comparisons, accumulation of knowledge, and accountability. Our third proposition is that what is essential is transparent use. Thus, replication and accumulation of knowledge are still possible, but redefined to focus on transparency in relation to all terms in the Value Equation. In this, the Value Equation is consistent with recent calls for redefining replicability in clinical science (e.g., [81]). Requirements from funders to provide information on all equation terms would help push development in this direction.

This proposition is consistent with calls for rigorous strategies to monitor, guide, and evaluate fidelity [82,83,84] as well as adaptation, as has increasingly been acknowledged (e.g., [17, 26, 29, 31, 85, 86]). The Value Equation adds to this by proposing transparent reporting of all equation terms, and justification of fidelity and adaptation based on how they promote fit between the EBI and the context and how they affect value. In this way, users will be supported in assessing INfi and INfc in subsequent implementations. Otherwise, the risk is what can be called “adaptation neglect,” a syndrome in which adaptations pass unnoticed or undocumented regardless of how obvious they are.

Toward personalized value equations

One of the main arguments for fidelity is to enable accumulation of knowledge through replication, putting the focus on only one of the terms of the equation. The Value Equation and the transparency proposition instead focus on all terms, thereby facilitating a gradual increase in the precision of the knowledge of what works for whom and when (i.e., specificity) [87]. This requires sophisticated processes and infrastructure. One way to achieve it may be to create databases of the different ways in which an intervention has been used, in what context, and to what effect [26, 86, 88]. Such data can thus form the basis for a gradually increased understanding of what creates value for whom, and show how the logic of the Value Equation can look in practice. For example, in the Paediatric Oncology Department of Karolinska University Hospital in Sweden, when a child does not respond as expected to a treatment protocol, adaptations are made, and both adaptations and effects are documented. Data from similar cases are accumulated, creating additional arms in the ongoing comparative trial. In this way, data on intervention*context configurations are collected.
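
Such a configuration database can be sketched as a store of intervention*context records with the value observed, queryable for what worked in which configuration. All fields and numbers below are invented; the paper describes these databases only conceptually.

```python
from collections import defaultdict

# Hypothetical records of intervention*context configurations and observed
# value, in the spirit of the configuration databases described above.
records = [
    {"ebi": "parent_training", "context": "urban_clinic", "adapted": False, "value": 0.7},
    {"ebi": "parent_training", "context": "rural_clinic", "adapted": True,  "value": 0.8},
    {"ebi": "parent_training", "context": "rural_clinic", "adapted": False, "value": 0.5},
]

def value_by_configuration(rows):
    """Average observed value per (context, adapted) configuration."""
    grouped = defaultdict(list)
    for r in rows:
        grouped[(r["context"], r["adapted"])].append(r["value"])
    return {key: sum(vals) / len(vals) for key, vals in grouped.items()}

print(value_by_configuration(records))
```

As records accumulate, such queries begin to answer "what works for whom, when" empirically rather than by assumption.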

Nevertheless, building databases that reflect the whole Value Equation may increase the administrative burden on clinical staff and organizations as a whole. A way to circumvent this risk may be to build a data infrastructure in which all stakeholders involved in the healthcare process (patients, providers, organizations, and system) are invited to share and use data for their specific needs, so that those entering the data also benefit from it in their daily operations [89]. Although such a development may seem utopian in many fragmented systems, there are examples of these learning healthcare systems, for instance, at Cincinnati Children’s Hospital Medical Center [89] and in rheumatology care in Sweden [90]. Researchers may, for example, use the data for comparative effectiveness studies, and healthcare system representatives for benchmarking. However, the most transformative aspect may be when patients and providers can use the system at the point of care to track how an EBI is used (fidelity and adaptation) and what value it creates for the specific patient. This is consistent with recent applications of measurement-based care, where data related to intervention, context, and implementation are assessed in real time along with clinical data to guide clinical decision making [91].

Used in this way, the learning healthcare system [92] will provide the most precise version of “what works for whom, when” we can think of: personalized value equations in patient- or provider-driven n = 1 studies [93]. Aggregation of all n = 1 studies will then provide the basis for accumulation of knowledge of “what works for whom, when,” thereby bridging personalized medicine and the ideas for systemizing knowledge about adaptations as outlined in the Adaptome [26].

Conclusions

In mathematics and statistics, we are used to thinking about how the different terms of an equation together determine the outcome. Implementation scientists can use the same approach to understand the product of an EBI, minding the context in which it is used and given the implementation strategies applied. The Value Equation is a theoretical proposition that reconciles the role of adaptation and fidelity in the research-to-practice pathway. The Value Equation states that the optimal value configuration of the intervention that can be obtained (V) is a product of the intervention (IN), the nature of the context (C) in which the intervention is being implemented, and how well the implementation strategy (IS) optimizes the intervention and the context. Fidelity and adaptation determine how these terms are mixed and, in turn, the end product: the value configuration it produces for multiple stakeholders.

The Value Equation contains three central propositions: 1) it positions implementation strategies as a way to create fit between EBIs and context, 2) it explicates that the product of implementation effort should move from emphasizing effects to emphasizing optimization of a multilevel value configuration, and 3) it shifts focus from fidelity to transparency over all terms of the equation. While there are many complexities in each of these propositions and in each of the terms in the equation, we suggest that the Value Equation be used to develop and test hypotheses that ultimately can help implementation science move toward a more granular understanding of how methods to promote the uptake of research findings can be optimized.

Availability of data and materials

NA

Abbreviations

EBI/EBIs:

Evidence-based intervention(s)

V:

Value

Vs:

Value and fit of intervention in the system context

Vo:

Value and fit of intervention in organizational context

Vpr:

Value and fit of the intervention for the provider

Vpt:

Value and fit of the intervention for the patient

IN:

Intervention

INf:

Extent to which the intervention is carried out as it was described (fidelity)

INfc:

Fidelity-consistent adaptations

INfi:

Fidelity-inconsistent adaptations

C:

Context

Cs:

System context

Co:

Organizational context

Cpr:

Provider context (e.g., professional discipline, training, attitudes toward the EBI)

Cpt:

Patient context (e.g., target group)

IS:

Implementation Strategy

ISc:

Implementation strategy optimizing the context

ISi:

Implementation strategy optimizing the intervention

References

  1. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1:1.

  2. Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implement Sci. 2007;2:1–9.

  3. Bellg AJ, Borrelli B, Resnick B, Hecht J, Minicucci DS, Ory M, et al. Enhancing Treatment Fidelity in Health Behavior Change Studies: Best Practices and Recommendations From the NIH Behavior Change Consortium. Health Psychol. 2004;23:443–51.

  4. Dusenbury L, Brannigan R, Falco M, Hansen WB. A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Educ Res. 2003;18:237–56.

  5. Hasson H. Systematic evaluation of implementation fidelity of complex interventions in health and social care. Implement Sci. 2010;5:67.

  6. Sechrest L, West SG, Phillips MA, Redner R, Yeaton W. Some neglected problems in evaluation research: Strength and integrity of treatments. Evaluation studies review annual. 1979;4:15–35.

  7. Gearing RE, El-Bassel N, Ghesquiere A, Baldwin S, Gillies J, Ngeow E. Major ingredients of fidelity: a review and scientific guide to improving quality of intervention research implementation. Clin Psychol Rev. 2011;31:79–88.

  8. Perepletchikova F, Treat TA, Kazdin AE. Treatment integrity in psychotherapy research: analysis of the studies and examination of the associated factors. J Consult Clin Psychol. 2007;75(6):829.

  9. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76.

  10. Chambers D, Simpson L, Neta G, von Thiele Schwarz U, Percy-Laurry A, Aarons GA, et al., editors. Proceedings from the 9th Annual Conference on the Science of Dissemination and Implementation. Implement Sci. 2017;12(Suppl 1).

  11. Lee SJ, Altschul I, Mowbray CT. Using planned adaptation to implement evidence-based programs with new populations. Am J Community Psychol. 2008;41:290–303.

  12. Moore J, Bumbarger B, Cooper B. Examining Adaptations of Evidence-Based Programs in Natural Contexts. J Prim Prev. 2013;34:147–61.

  13. Stirman S, Miller C, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8:65.

  14. Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, et al. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implement Sci. 2012;7:32.

  15. Card JJ, Solomon J, Cunningham SD. How to adapt effective programs for use in new contexts. Health Promot Pract. 2011;12:25–35.

    Article  PubMed  Google Scholar 

  16. Waller G. Evidence-based treatment and therapist drift. Behav Res Ther. 2009;47(2):119–27.

    Article  PubMed  Google Scholar 

  17. Bumbarger BK, Kerns SEU. Introduction to the Special Issue: Measurement and Monitoring Systems and Frameworks for Assessing Implementation and Adaptation of Prevention Programs. The Journal of Primary Prevention. 2019;40(1):1–4.

    Article  PubMed  Google Scholar 

  18. von Thiele Schwarz U, Hasson H, Aarons GA, Sundell K. Usefulness of evidence -Adaptation and adherence of evidence-based methods. Nordic Implementation Conference; 2018, May 29; Copenhagen, Denmark.

  19. Aarons GA, Miller EA, Green AE, Perrott JA, Bradway R. Adaptation happens: a qualitative case study of implementation of The Incredible Years evidence-based parent training programme in a residential substance abuse treatment programme. Journal of Children's Services. 2012;7(4):233–45.

    Article  Google Scholar 

  20. Wiltsey Stirman S, Gamarra JM, Bartlett BA, Calloway A, Gutner CA. Empirical examinations of modifications and adaptations to evidence-based psychotherapies: Methodologies, impact, and future directions. Clin Psychol: Science Practice. 2017;24(4):396–420.

    Google Scholar 

  21. Aarons GA, Sklar M, Mustanski B, Benbow N, Brown CH. “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implement Sci. 2017;12:111.

    Article  PubMed  PubMed Central  Google Scholar 

  22. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.

    Article  PubMed  PubMed Central  Google Scholar 

  23. Rogers EM. Diffusion of innovations. New York: Simon and Schuster; 2010.

    Google Scholar 

  24. Castro FG, Barrera M Jr, Martinez CR Jr. The cultural adaptation of prevention interventions: Resolving tensions between fidelity and fit. Prev Sci. 2004;5:41–5.

    Article  PubMed  Google Scholar 

  25. Cabassa L, Baumann A. A two-way street: bridging implementation science and cultural adaptations of mental health treatments. Implement Sci. 2013;8:90.

    Article  PubMed  PubMed Central  Google Scholar 

  26. Chambers DA, Norton WE. The adaptome: advancing the science of intervention adaptation. Am J Prev Med. 2016;51:S124–31.

    Article  PubMed  PubMed Central  Google Scholar 

  27. von Thiele Schwarz U, Förberg U, Sundell K, Hasson H. Colliding ideals–an interview study of how intervention researchers address adherence and adaptations in replication studies. BMC Med Res Methodol. 2018;18:36.

  28. Chambers D, Glasgow R, Stange K. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117.

    Article  PubMed  PubMed Central  Google Scholar 

  29. Rabin BA, McCreight M, Battaglia C, Ayele R, Burke RE, Hess PL, et al. Systematic. Multimethod Assessment of Adaptations Across Four Diverse Health Systems Interventions. Front Public Health. 2018;6:102.

    PubMed  Google Scholar 

  30. Pérez D, Van der Stuyft P, del Carmen ZM, Castro M, Lefèvre P. A modified theoretical framework to assess implementation fidelity of adaptive public health interventions. Implement Sci. 2015;11:91.

    Article  Google Scholar 

  31. Stirman SW, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14:58.

    Article  Google Scholar 

  32. Roscoe JN, Shapiro VB, Whitaker K, Kim BE. Classifying changes to preventive interventions: applying adaptation taxonomies. J Prim Prev. 2019;40:89–109.

  33. Hoffmann TC, Erueti C, Glasziou PP. Poor description of non-pharmacological interventions: analysis of consecutive sample of randomised trials. BMJ. 2013;347:f3755.

  34. Leichsenring F, Steinert C, Ioannidis JP. Toward a paradigm shift in treatment and research of mental disorders. Psychol Med. 2019:1–7.

  35. Cox JR, Martinez RG, Southam-Gerow MA. Treatment integrity in psychotherapy research and implications for the delivery of quality mental health services. J Consult Clin Psychol. 2019;87:221.

  36. Glasziou P, Meats E, Heneghan C, Shepperd S. What is missing from descriptions of treatment in trials and reviews? BMJ. 2008;336:1472.

  37. Dane AV, Schneider BH. Program integrity in primary and early secondary prevention: are implementation effects out of control? Clin Psychol Rev. 1998;18:23–45.

  38. Yeaton WH, Sechrest L. Critical dimensions in the choice and maintenance of successful treatments: strength, integrity, and effectiveness. J Consult Clin Psychol. 1981;49:156.

  39. Stirman SW, Gutner C, Crits-Christoph P, Edmunds J, Evans AC, Beidas RS. Relationships between clinician-level attributes and fidelity-consistent and fidelity-inconsistent modifications to an evidence-based psychotherapy. Implement Sci. 2015;10(1):115.

  40. Resnicow K, Soler R, Braithwaite RL, Ahluwalia JS, Butler J. Cultural sensitivity in substance use prevention. J Community Psychol. 2000;28:271–90.

  41. Hawe P. Lessons from complex interventions to improve health. Annu Rev Public Health. 2015;36:307–23.

  42. Steckler AB, Linnan L, Israel B. Process evaluation for public health interventions and research. San Francisco: Jossey-Bass; 2002.

  43. von Thiele Schwarz U, Hasson H, Lindfors P. Applying a fidelity framework to understand adaptations in an occupational health intervention. Work. 2015;51:195–203.

  44. Aarons GA, Ehrhart MG, Farahnak LR, Sklar M. Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35:255–74.

  45. Castro FG, Yasui M. Advances in EBI development for diverse populations: towards a science of intervention adaptation. Prev Sci. 2017;18:623–9.

  46. Cook TD, Campbell DT, Shadish W. Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin; 2002.

  47. Schoenwald SK, Garland AF, Chapman JE, Frazier SL, Sheidow AJ, Southam-Gerow MA. Toward the effective and efficient measurement of implementation fidelity. Adm Policy Ment Health. 2011;38:32–43.

  48. Scanlon JW, Horst P, Nay JN, Schmidt RE, Waller A. Evaluability assessment: avoiding type III and IV errors. Evaluation Management. 1977:71–90.

  49. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation and translation methodology. Eval Health Prof. 2006;29:126–53.

  50. Aarons G, Hurlburt M, Horwitz S. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23.

  51. Schmidt S. Shall we really do it again? The powerful concept of replication is neglected in the social sciences. Rev Gen Psychol. 2009;13:90.

  52. Escoffery C, Lebow-Skelley E, Haardoerfer R, Boing E, Udelson H, Wood R, et al. A systematic review of adaptations of evidence-based public health interventions globally. Implement Sci. 2018;13:125.

  53. von Thiele Schwarz U, Lundmark R, Hasson H. The dynamic integrated evaluation model (DIEM): achieving sustainability in organizational intervention through a participatory evaluation approach. Stress Health. 2016;32(4):285–93.

  54. Shediac-Rizkallah M, Bone L. Planning for the sustainability of community-based health programs: conceptual frameworks and future directions for research, practice and policy. Health Educ Res. 1998;13:87–108.

  55. Blakely CH, Mayer JP, Gottschalk RG, Schmitt N, Davidson WS, Roitman DB, et al. The fidelity-adaptation debate: implications for the implementation of public sector social programs. Am J Community Psychol. 1987;15:253–68.

  56. Hansen WB, Graham JW, Wolkenstein BH, Rohrbach LA. Program integrity as a moderator of prevention program effectiveness: results for fifth-grade students in the adolescent alcohol prevention trial. J Stud Alcohol. 1991;52:568–79.

  57. Becker SJ, Tanzman B, Drake RE, Tremblay T. Fidelity of supported employment programs and employment outcomes. Psychiatr Serv. 2001;52:834.

  58. Fauskanger Bjaastad J, Henningsen Wergeland GJ, Mowatt Haugland BS, Gjestad R, Havik OE, Heiervang ER, et al. Do clinical experience, formal cognitive behavioural therapy training, adherence, and competence predict outcome in cognitive behavioural therapy for anxiety disorders in youth? Clin Psychol Psychother. 2018;25:865–77.

  59. Sundell K, Beelmann A, Hasson H, von Thiele Schwarz U. Novel programs, international adoptions, or contextual adaptations? Meta-analytical results from German and Swedish intervention research. J Clin Child Adolesc Psychol. 2015:1–13.

  60. Bond GR, Becker DR, Drake RE. Measurement of fidelity of implementation of evidence-based practices: case example of the IPS Fidelity Scale. Clin Psychol Sci Pract. 2011;18:126–41.

  61. Tinetti ME, Fried TR, Boyd CM. Designing health care for the most common chronic condition—multimorbidity. JAMA. 2012;307:2493–4.

  62. Tonelli M. The philosophical limits of evidence-based medicine. Acad Med. 1998;73:1234–40.

  63. Joyner MJ, Paneth N. Seven questions for personalized medicine. JAMA. 2015;314:999–1000.

  64. Anyon Y, Roscoe J, Bender K, Kennedy H, Dechants J, Begun S, et al. Reconciling adaptation and fidelity: implications for scaling up high quality youth programs. J Prim Prev. 2019;40(1):35–49.

  65. Marques L, Valentine SE, Kaysen D, Mackintosh M-A, De Silva D, Louise E, et al. Provider fidelity and modifications to cognitive processing therapy in a diverse community health clinic: associations with clinical change. J Consult Clin Psychol. 2019;87(4):357.

  66. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44:177–94.

  67. Michie S, Atkins L, West R. The behaviour change wheel: a guide to designing interventions. Great Britain: Silverback Publishing; 2014.

  68. Kakeeto M, Lundmark R, Hasson H, von Thiele Schwarz U. Meeting patient needs trumps adherence: a cross-sectional study of adherence and adaptations when national guidelines are used in practice. J Eval Clin Pract. 2017;23:830–8.

  69. Elliott DS, Mihalic S. Issues in disseminating and replicating effective prevention programs. Prev Sci. 2004;5:47–53.

  70. Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14:42.

  71. Lyon AR, Bruns EJ. User-centered redesign of evidence-based psychosocial interventions to enhance implementation—hospitable soil or better seeds? JAMA Psychiatry. 2019;76:3–4.

  72. Drahota A, Meza RD, Brikho B, Naaf M, Estabillo JA, Gomez ED, et al. Community-academic partnerships: a systematic review of the state of the literature and recommendations for future research. Milbank Q. 2016;94(1):163–214.

  73. Hasson H, Gröndal H, Hedberg Rundgren Å, Avby G, Uvhagen H, von Thiele Schwarz U. How can evidence-based interventions give the best value for users in social services? Balance between adherence and adaptations: a study protocol. Implement Sci Commun. In press.

  74. Moore JE, Mascarenhas A, Bain J, Straus SE. Developing a comprehensive definition of sustainability. Implement Sci. 2017;12:110.

  75. Stirman SW, Finley EP, Shields N, Cook J, Haine-Schlagel R, Burgess JF, et al. Improving and sustaining delivery of CPT for PTSD in mental health systems: a cluster randomized trial. Implement Sci. 2017;12(1):32.

  76. Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14:1.

  77. Rutledge T, Loh C. Effect sizes and statistical testing in the determination of clinical significance in behavioral medicine research. Ann Behav Med. 2004;27:138–45.

  78. Aarons GA, Fettes DL, Hurlburt MS, Palinkas LA, Gunderson L, Willging CE, et al. Collaboration, negotiation, and coalescence for interagency-collaborative teams to scale-up evidence-based practice. J Clin Child Adolesc Psychol. 2014;43:915–28.

  79. Green AE, Fettes DL, Aarons GA. A concept mapping approach to guide and understand dissemination and implementation. J Behav Health Serv Res. 2012:362–73.

  80. von Thiele Schwarz U, Richter A, Hasson H. Getting everyone on the same page: cocreated program logic (COP). In: Nielsen K, Noblet A, editors. Organizational interventions for health and well-being. Taylor and Francis; 2018. p. 58–83.

  81. Tackett JL, Lilienfeld SO, Patrick CJ, Johnson SL, Krueger RF, Miller JD, et al. It’s time to broaden the replicability conversation: thoughts for and from clinical psychological science. Perspect Psychol Sci. 2017;12:742–56.

  82. Des Jarlais D, Lyles C, Crepaz N, TREND Group. Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND statement. Am J Public Health. 2004;94:361–6.

  83. Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMC Med. 2010;8:18.

  84. Von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP, et al. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Med. 2007;4:e296.

  85. Lewis CC, Lyon AR, McBain SA, Landes SJ. Testing and exploring the limits of traditional notions of fidelity and adaptation in implementation of preventive interventions. J Prim Prev. 2019;40:137–41.

  86. DeRosier ME. Three critical elements for real-time monitoring of implementation and adaptation of prevention programs. J Prim Prev. 2019;40:129–35.

  87. Pawson R. The science of evaluation: a realist manifesto. Sage; 2013.

  88. Berkel C, Gallo CG, Sandler IN, Mauricio AM, Smith JD, Brown CH. Redesigning implementation measurement for monitoring and quality improvement in community delivery settings. J Prim Prev. 2019;40:111–27.

  89. Lindblad S, Ernestam S, Van Citters A, Lind C, Morgan T, Nelson E. Creating a culture of health: evolving healthcare systems and patient engagement. QJM: Int J Med. 2017;110:125–9.

  90. Ovretveit J, Keller C, Forsberg HH, Essén A, Lindblad S, Brommels M. Continuous innovation: developing and using a clinical database with new technology for patient-centred care—the case of the Swedish quality register for arthritis. Int J Qual Health Care. 2013;25:118–24.

  91. Scott K, Lewis CC. Using measurement-based care to enhance any treatment. Cogn Behav Pract. 2015;22:49–59.

  92. Atkins D, Kilbourne AM, Shulkin D. Moving from discovery to system-wide change: the role of research in a learning health care system: experience from three decades of health systems research in the Veterans Health Administration. Annu Rev Public Health. 2017;38:467–87.

  93. Riggare S, Unruh KT, Sturr J, Domingos J, Stamford JA, Svenningsson P, et al. Patient-driven N-of-1 in Parkinson’s disease. Methods Inf Med. 2017;56:e123–e8.

Funding

This work was funded by a research grant from the Swedish Research Council (project no. 2016–01261) and a visiting researcher grant from FORTE (DELG-2017/0024) after competitive peer-review processes. GAA was supported in part by the US National Institute on Drug Abuse (Grant # R01DA038466) and the National Institute of Mental Health (Grant # R01MH072961 and R03MH117493). The funders had no role in the design and conduct of the study or in the writing of the manuscript. The content is solely the responsibility of the authors and does not necessarily represent the official views of the Swedish Research Council, Forte, or the US National Institutes of Health.

Author information

Contributions

UvTS and HH conducted the literature review and created the first version of the Value Equation, which was further refined with GAA. All authors contributed significantly to the development of the paper. All authors read and approved the final version.

Corresponding author

Correspondence to Ulrica von Thiele Schwarz.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

von Thiele Schwarz, U., Aarons, G.A. & Hasson, H. The Value Equation: Three complementary propositions for reconciling fidelity and adaptation in evidence-based practice implementation. BMC Health Serv Res 19, 868 (2019). https://doi.org/10.1186/s12913-019-4668-y

Keywords