
Influence of participation in a quality improvement collaborative on staff perceptions of organizational sustainability



Sustainability capacity (SC), an organization’s ability to implement and maintain change, is influenced by internal attributes, environmental contextual influencers, and intervention attributes. Temporal changes in staff SC perceptions, as well as the influence of quality improvement collaborative (QIC) participation, have generally not been explored. This project addresses this gap by measuring staff SC perceptions at four time points (baseline and every 9 months) for clinics participating in an intervention: the Network for the Improvement of Addiction Treatment QIC initiative (called NIATx200).


A mixed linear model repeated measures analysis was applied to matched staff members (n = 908, representing 2329 total cases) across the evaluation timeframe. Three separate statistical models assessed potential predictors of SC perceptions: Time (Models I-III); NIATx200 intervention, staff job function, and tenure (Models II & III); and NIATx200 participation hours and four organizational variables (Model III).


For Model I, staff perceptions of total SC increased throughout most of the study (t1,4 = − 6.74, p < .0001; t2,4 = − 3.100, p < .036; t3,4 = − 0.23, p = ns). Model II did not change Model I’s overall Time effect, but combined NIATx200 services (t = − 2.23, p = .026), staff job function (t = − 3.27, p = .001), and organizational administrators (t = − 3.50, p = .001) were also significantly associated with greater perceptions of total SC. Inclusion of additional variables in Model III demonstrated the importance of a higher participation level (t = − 3.09, p < .002) and being in a free-standing clinic (t = − 2.06, p < .04) on staff perceptions of total SC.


Although staff exposure to sustainability principles was minimal in NIATx200, staff perceptions about their organization’s SC significantly differed over time. However, an organization’s participation level in a QIC became the principal predictor of staff SC perceptions, regardless of other factors’ influence. Given these findings, it is possible to develop and introduce specific sustainability content within the structure of a QIC to assess the impact on staff SC perceptions over time and the sustainment of organizational change.

Trial registration: NCT00934141. Registered July 6, 2009. Retrospectively registered.



An organization’s sustainability capacity (SC) represents its ability to implement and maintain the benefits of a systems change over time [1]. Substance use treatment clinics provide services to individuals with an opioid use disorder or alcohol use disorder, including clinical counseling and access to medications, and many of these individuals have co-occurring mental health disorders [2, 3]. These clinics face unique challenges when trying to implement and sustain changes, such as disengaged staff, lack of organizational capacity to sustain change, unease with making changes, or administrative process barriers such as multiple phone calls to schedule an appointment [4,5,6,7]. It is important, therefore, to understand how substance use clinic staff perceive the likelihood that changes within their organization will be sustained.

Sustainability frameworks suggest that an organization’s capacity to sustain change is influenced by multilevel factors or constructs related to organizational attributes, environmental contextual features, and intervention characteristics [1, 8,9,10]. Examples of organizational attributes include leadership support, champion roles, revised policies and procedures, or expert coaching support; external contextual features relate to regulatory or financial changes; and innovation attributes focus on ease of use or understanding how likely the benefits of the change would be sustained. Multiple studies have shown that various organizational, external, and innovation attributes inform the likelihood of sustaining an evidence-based practice within an organization [11,12,13,14,15,16,17,18,19]. Efforts to sustain change within an organization are not only a function of its SC but also depend on staff involvement. Although leadership support and the role of a champion are factors or constructs within these frameworks, less emphasis is placed on the role of staff involvement in implementing and sustaining change. As such, little is known about how participation in change efforts influences staff perceptions of an organization’s SC over time.

Sustainability capacity

Recent efforts sought to define and refine the classification of sustainability factors or constructs. The Integrated Sustainability Framework (ISF) identified 36 factors across multiple settings (e.g., community, school, clinical/social services) as being associated with sustainability [20]. These factors were grouped into five contexts: (a) outer context, (b) inner context, (c) intervention characteristics, (d) processes, and (e) implementer and population characteristics. Example ISF factors within each context include sociopolitical context and funding environment (outer context); funding/resources, staffing and turnover (inner context); adaptability, fit with population or context and benefits/need (intervention); partnership/engagement and program evaluation (process); and implementer motivation and attitudes (implementer/population) [20].

Alternatively, the Consolidated Framework for Sustainability Constructs (CFSC) conceptualized 40 constructs across six themes associated with sustainability of change in healthcare settings [21]. Themes include: (a) initiative design and delivery, (b) negotiations related to the initiative processes, (c) organizational setting, (d) people or individuals involved, (e) resources, and (f) external environment. The CFSC also explored approaches [retrospective (after the implementation has occurred) versus prospective (explored throughout implementation)] for assessing staff perceptions about sustainability and the level of focus [organizational (e.g., substance use provider) versus intervention (e.g., a single improvement project)] associated with the assessment of sustainability capacity [21]. These efforts resulted in the identification of the ten most prevalent sustainability constructs within each category (Additional File 1) [21]. Four constructs are common across both levels of focus and both assessment timings. These include: demonstrating effectiveness and monitoring progress over time (initiative design and delivery), leadership and champions (people involved), and general resources to support sustainability (resources) [21]. Other CFSC constructs varied according to whether they should be assessed at the organizational or intervention level. For example, training and capacity building, and integration with existing programs and policies, were not typically assessed at the organizational level of focus; staff beliefs in the initiative were not assessed at the intervention level of focus, nor was stakeholder participation in the retrospective approach [21].

Staff perceptions of sustainability capacity

Conceptually, the ISF and CFSC frameworks treat sustainability capacity as a process by which certain organizational attributes such as leadership support or staff involvement influence how change is sustained in an organization. Despite extensive research on sustainability constructs and associated frameworks, few instruments have been developed and extensively utilized in research to quantitatively assess staff perceptions about organizational sustainability capacity associated with the constructs in the ISF or CFSC [22,23,24]. These constructs represent elements often included in the structure of a quality improvement collaborative (QIC), such as understanding how implementation support and improvement methods might interact with stakeholder participation in training and capacity building activities to influence staff perceptions about an organization’s ability to sustain change. Although the Normalization Process Theory established a framework for evaluating staff perceptions and participation [25], research has not, to date, explored staff perceptions about SC as an outcome measure and how participation in a QIC influences those changes. Further, sustainability instruments have not been utilized to prospectively assess how staff perceptions about sustainability change over time while participating in a QIC.

Our objective in this manuscript was to explore temporal changes in staff SC perceptions for individuals working for substance use treatment providers, as measured by the British National Health Services Sustainability Index (BNHS-SI), and how participation in a Network for the Improvement of Addiction Treatment (NIATx) QIC, called NIATx200, influenced those changes. Data collected during the NIATx200 initiative were utilized to begin addressing this important implementation research issue. This analysis builds on a prior analysis [26]; the research question, “What staff and organizational characteristics predict sustainability levels across the study timeframe?” guided study design and sample selection. The specific aims of the current paper were to: (1) explore temporal changes in staff perceptions about sustainability and (2) assess how staff and organizational characteristics, as well as organizational participation in their assigned implementation strategy within a QIC, influence changes in sustainability over time.


Study setting: NIATx200

The NIATx200 initiative built on prior successful NIATx research [4, 27,28,29,30]. NIATx200 evaluated the effectiveness of implementation strategies commonly used in a QIC. To achieve this objective, NIATx200 recruited 201 addiction treatment clinics in five states (Massachusetts, Michigan, New York, Oregon, and Washington). Clinic eligibility criteria included: 60+ admissions per year; outpatient or intensive outpatient levels of care as defined by the American Society of Addiction Medicine (ASAM); and receipt of some public funding in the past year [31]. Clinics, randomized within states, were stratified by size (number of patients per year) and management score [32] and assigned to one of four implementation strategies: (1) interest circle calls (n = 49), (2) learning sessions (n = 54), (3) coaching (n = 50), or (4) a combination of all three implementation strategies (n = 48). The NIATx200 initiative consisted of an 18-month active implementation timeframe. During three distinct implementation periods lasting 6 months, participating clinics implemented organizational changes designed to improve wait time (mean days between first contact and first treatment), retention in treatment (percent of patients retained from first to fourth treatment session), and annual admissions. Data were also collected at the staff level about their perceptions of organizational readiness for change and sustainability propensity. The structure of the NIATx200 initiative and the description of the implementation strategies are described in more detail elsewhere [31, 33, 34].

Mixed-effect regression models determined which implementation strategy was most effective in improving outcomes, as well as most cost-effective [31]. Improvements in the wait time and admission outcomes for clinics assigned to the coaching and combination strategies significantly differed from clinics assigned to the interest circle strategy, and coaching was more cost-effective than interest circles [34]. Although no NIATx implementation strategy significantly improved treatment retention (as defined), an exploratory analysis that accounted for early treatment drop-off (i.e., a client not making it to the first treatment session) when measuring retention showed clinic-level improvements for providers assigned to the coaching, combination, and learning session implementation strategies, suggesting that how retention was defined affected the findings [34]. Results from this exploratory analysis clearly indicated that clinic participation in the three interventions (i.e., learning sessions, coaching, and the combination arm) improved the outcomes, with coaching being the most cost-effective strategy.

Although differences in clinic attributes did not affect improvements in the outcomes examined in other studies, organizational characteristics were included in these secondary data analyses. Organizational characteristics comprised: (1) non-profit status, (2) whether the clinic was free-standing Alcohol and Drug Abuse Treatment Program or part of a healthcare system, (3) whether the clinic had received accreditation from a national organization such as the Joint Commission on Accreditation of Healthcare Organizations or the Commission on Accreditation of Rehabilitation Facilities, and (4) the metropolitan statistical area (rural or urban status).

Implementation strategies

The structure of the four NIATx200 implementation strategies represented clinic participation levels. Interest circles involved monthly multi-clinic teleconferences for a total of 18 direct contact hours (18 calls, each one hour in length), and allowed change teams from participating clinics to receive advice from peers and learn new skills. The learning session strategy consisted of three face-to-face multi-day sessions held approximately every six months, which were led by a core faculty team and utilized a common curriculum to offer didactic and experiential learning opportunities. The first learning session consisted of 8.5 h of content delivered over a single day, while another 13 content hours were delivered over 1.5 days during each of the second and third learning sessions, resulting in a potential for 34.5 total direct contact hours. Clinics assigned to the coaching strategy received a one-day, 4-h site visit, as well as participated in monthly one-hour coaching calls; 22 direct contact hours were possible for the coaching strategy. On the calls, the coach and the change leader, executive sponsor, and change team reviewed the impact of organizational changes to improve the study outcomes, discussed successes, and identified ideas for future change projects. The combination strategy involved the interest circle calls, coaching, and learning sessions, and consisted of a cumulative possibility of 74.5 direct contact hours. NIATx200 results indicated that clinics assigned to interventions with higher participation hours (with interest circles as the reference intervention) showed greater improvements in wait time and admissions. As such, staff in clinics assigned to interventions with more opportunities to participate, and thus more exposure to sustainability concepts, were expected to have higher perceptions about the likelihood that changes would be sustained.
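The direct contact hours stated for each strategy can be cross-checked with simple arithmetic; the sketch below uses only the figures given in this section:

```python
# Direct contact hours per NIATx200 implementation strategy,
# reconstructed from the study description above.
interest_circles = 18 * 1.0          # 18 monthly one-hour calls
learning_sessions = 8.5 + 13 + 13    # one single-day session + two 1.5-day sessions
coaching = 4 + 18 * 1.0              # one 4-hour site visit + monthly one-hour calls

# The combination strategy bundled all three strategies.
combination = interest_circles + learning_sessions + coaching

assert learning_sessions == 34.5
assert coaching == 22
assert combination == 74.5
```

The totals are internally consistent with the 18, 34.5, 22, and 74.5 direct contact hours reported for the four strategies.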

Outcomes and measurement

The NIATx200 initiative utilized the British National Health Services Sustainability Index (BNHS-SI) to assess staff perceptions about the likelihood that a change will be sustained in the organization [23, 24]. The BNHS-SI has been utilized across multiple healthcare settings to assess staff perceptions about the sustainability of an organizational change [24, 35,36,37,38,39,40,41,42,43,44,45] and as a framework to qualitatively identify factors associated with the concept of sustainability [44, 46,47,48].

The tool (see Additional File 2 for questions) consists of 10 factors designed to assess overall staff perceptions about sustainability as well as their perceptions across three domains:

  1. Process – benefits beyond helping patients, credibility of the benefits, adaptability of the improved process, and effectiveness of systems to monitor progress.

  2. Staff – staff involvement and training to sustain the process, staff attitudes toward sustaining the change, senior leadership engagement, and clinical leadership engagement.

  3. Organization – fit with organization’s strategic aims and culture, and infrastructure for sustainability.

The BNHS-SI utilizes an additive multi-attribute utility model to summarize the scores across the three domains (see Additional File 3), which are then totaled to arrive at an overall organizational sustainability propensity score [24].
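The additive scoring logic can be sketched as follows. The factor names mirror the three domains listed above, but note that the real instrument assigns factor-specific maximum weights (Additional File 3), which this equal-weight sketch omits:

```python
# Sketch of the BNHS-SI additive multi-attribute scoring: factor-level scores
# sum to three domain scores, which are totaled into an overall propensity
# score. Equal weighting is a placeholder, not the published BNHS-SI weights.

DOMAINS = {
    "process": ["benefits", "credibility", "adaptability", "monitoring"],
    "staff": ["involvement_training", "attitudes",
              "senior_leadership", "clinical_leadership"],
    "organization": ["strategic_fit", "infrastructure"],
}

def bnhs_si_scores(factor_scores):
    """Return (domain scores, total sustainability score) for one respondent."""
    domain_scores = {
        domain: sum(factor_scores[f] for f in factors)
        for domain, factors in DOMAINS.items()
    }
    return domain_scores, sum(domain_scores.values())

# A hypothetical respondent scoring 5.0 on every one of the 10 factors:
domains, total = bnhs_si_scores({f: 5.0 for fs in DOMAINS.values() for f in fs})
assert total == 50.0 and domains["process"] == 20.0
```

The total is, by construction, the sum of the Process, Staff, and Organization domain scores, matching how the Total Sustainability Score and the three Domain Scores are related in this study.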

In the NIATx200 initiative, a staff sustainability survey [31] was developed and distributed at baseline and at every subsequent 9-month period (see Fig. 1) to prospectively assess clinic staff perceptions about sustainability capacity. The BNHS-SI measures the likelihood that a change will be sustained; therefore, it does not rely on a set sustainability definition (e.g., clinic continued to maintain the intervention after funding ended) when asking staff to complete the instrument. Instead, survey instructions stated that the BNHS-SI was “designed to gauge your organization’s propensity for sustaining changes”. As such, staff were asked to “think about one specific change implemented as part of the NIATx200 project” and then to select one of four options for each of the 10 factors that best described sustainability in their organization. Our team utilized a similar approach when assessing sustainability capacity within the Veterans Administration [38,39,40].

Fig. 1 NIATx 200 Study Timeline

For this analysis, the cumulative extent of staff beliefs that changes implemented as part of the NIATx200 initiative would be sustained (called the Total Sustainability Score) was the primary outcome. Three secondary outcomes also were evaluated – representing scores from the process, staff, and organization domains from the BNHS-SI tool (called Process, Staff, and Organization Domain Scores, respectively).

Data collection

Staff were invited to complete a paper survey or use a link in the invitation letter to complete the survey online. The survey also collected staff demographic information related to job function, employment status, and tenure within the organization. Two additional questions (i.e., “What is the first initial of your mother’s maiden name?” and “On what day of the month is your birthday?”) were combined with staff demographic characteristics and the clinic ID to create a unique identifier for individual staff members, allowing individual survey responses to be matched and tracked over time.
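The matching-key construction might look like the following sketch. The field names and hashing step are hypothetical, not the study's actual procedure, but illustrate how combining stable, low-sensitivity answers yields a pseudonymous key that stays constant across survey waves:

```python
import hashlib

def staff_key(clinic_id, job_function, maiden_initial, birth_day):
    """Build a stable pseudonymous key from the survey's matching items.

    Hypothetical reconstruction: combines clinic ID, a demographic field,
    the mother's-maiden-name initial, and day-of-month of birth; hashing
    avoids storing the raw answers.
    """
    raw = f"{clinic_id}|{job_function}|{maiden_initial.upper()}|{int(birth_day):02d}"
    return hashlib.sha256(raw.encode()).hexdigest()[:12]

# The same answers at two waves yield the same key, enabling matching
# (the clinic ID and field values here are made up for illustration).
wave1 = staff_key("C017", "clinical", "m", 14)
wave3 = staff_key("C017", "clinical", "M", 14)
assert wave1 == wave3
assert wave1 != staff_key("C018", "clinical", "M", 14)
```

Normalizing case and zero-padding the day keeps minor answer variations (e.g., "m" vs. "M") from breaking the match.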

Clinic participation (direct contact hours) in the assigned implementation strategy and the number of persons from the clinic participating in the assigned implementation strategy were recorded in real time by NIATx200 research staff and coaches.

Design and sample

The unique identifier was utilized to match individual survey responses across the four different time points (Fig. 1). As a result, all analyses are based on responses from the same staff members (n = 908, representing 2329 total cases) across the evaluation timeframe.

An important variable for this analysis is each clinic’s cumulative level of staff participation in QIC activities throughout the 27-month intervention interval, measured at baseline and approximately every nine months (called Total Participation). Participation in each of the four study interventions was measured separately but, for the purposes of this study, was aggregated into a Total Participation metric. This variable is used to determine the influence of the number of encounters with the implementation strategy during the 27-month period. Although this factor does not represent the total number of staff who took part in each activity, and therefore reflects a clinic-level influence, it remains dependent on overall staff involvement. As such, degree of staff participation is considered appropriate and relevant, and is retained for this sample.


Analysis comprised both simple descriptive statistics and multivariate model building. Descriptive statistics were calculated for all variables used for this study. The type of descriptive values depended on whether the variables were continuous or categorical. For continuous variables, the mean, standard deviation (SD), and min/max are reported, while the frequencies of each category are provided for the categorical variables. Bivariate analyses were conducted on the primary outcome measure (Total Sustainability Score) and each anticipated study variable before entry into the models; all variables used in the models demonstrated a significant independent association with the sustainability total.
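The descriptive step can be sketched with a small helper that dispatches on variable type, as described above (the sample values are made up for illustration):

```python
from statistics import mean, stdev
from collections import Counter

def describe(values):
    """Mean/SD/min/max for continuous variables; frequencies for categorical."""
    if all(isinstance(v, (int, float)) for v in values):
        return {"mean": mean(values), "sd": stdev(values),
                "min": min(values), "max": max(values)}
    return dict(Counter(values))

# Hypothetical participation hours for five clinics:
hours = [0, 12, 26, 40, 64]
summary = describe(hours)
assert summary["mean"] == 28.4 and summary["min"] == 0 and summary["max"] == 64

# A categorical variable falls through to frequency counts:
jobs = describe(["clinical", "clinical", "administrative"])
assert jobs == {"clinical": 2, "administrative": 1}
```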

The multivariate method was a linear mixed model repeated measures analysis that fit three separate statistical models to assess potential predictors of the staff-level Total Sustainability Score, as well as of the Process, Staff, and Organization Domain Scores. For each model, a repeated covariance type of AR(1): Heterogeneous was used, which assumes different variances at each measurement time as well as correlations across time points that weaken over successive assessments. All variables were entered into the models as fixed effects, and a maximum likelihood method was used to estimate the variable parameters. The statistical models are as follows:

  1. Model I – containing only the variable representing the four time points during the NIATx200 initiative (Time),

  2. Model II – containing Time, plus NIATx200-provided Implementation Strategies (i.e., learning sessions, interest circle calls, coaching sessions, or service combinations) and Job Function (i.e., administrative vs. clinical), and

  3. Model III – containing the variables from Models I and II, plus organizational characteristics and the cumulative extent of participation in NIATx200-provided strategies (i.e., total number of hours).
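The heterogeneous first-order autoregressive repeated covariance structure used in these models can be sketched in a few lines; the standard deviations and correlation below are illustrative values, not estimates from the study:

```python
# Sketch of the ARH(1) repeated covariance structure: each of the four
# time points gets its own variance, and the correlation between two
# time points decays as they grow further apart.

def arh1_covariance(sds, rho):
    """Cov[t_i, t_j] = rho**|i - j| * sd_i * sd_j."""
    n = len(sds)
    return [[rho ** abs(i - j) * sds[i] * sds[j] for j in range(n)]
            for i in range(n)]

cov = arh1_covariance([1.0, 1.2, 1.1, 0.9], rho=0.6)

# Correlation between time 1 and time 4 has decayed to rho**3:
assert cov[0][3] == 0.6 ** 3 * 1.0 * 0.9
# Variances differ by time point (heterogeneous diagonal):
assert cov[1][1] == 1.2 * 1.2
```

This differs from a standard AR(1) structure only in allowing a distinct variance at each measurement occasion, which is what "Heterogeneous" refers to in the model specification.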

IBM SPSS v26® was used to calculate all descriptive statistics and to estimate each model, calculating parameter estimates for fixed effects with 95% confidence intervals. This study is reported in full accordance with the StaRI checklist [Additional File 4] [49].


Descriptive variables

The final sample size represents responses from the same staff members (n = 908, representing 2329 total cases) across the evaluation timeframe. The Total Participation in the NIATx200 implementation strategies represented the only continuous independent variable used in the models and ranged from no time to 64 h (mean = 26.03, SD = 18.23). Table 1 lists the category frequencies for the remaining model variables, as well as the mean Total Sustainability Scores associated with each variable category.

Table 1 Frequencies of Categorical Variables and Sustainability Scores Per Category

Repeated measures model I

Primary outcome

Model I (see Table 2) yielded strong overall predictive significance for Time (F = 7.270, p < .0001), with staff perceptions about overall sustainability capacity increasing throughout most of the study.

Table 2 Staff-Level Sustainability Perceptions: Linear Mixed Model Results for Model 1

Secondary outcomes

The time effect pattern for the primary outcome was also identified for the process and organizational domains. However, Staff Domain Scores evidenced a statistically significant increase only when comparing the study endpoints.

Repeated measures model II

Primary outcome

Table 3 shows that Model II did not change the overall Time effect profile identified in Model I for overall staff perceptions about sustainability capacity. However, the assigned NIATx200 strategy and staff job function were significant – participation in combined services (compared to learning sessions only) and organization administrators were associated with greater perceptions about sustainability propensity.

Table 3 Staff-Level Sustainability Perceptions: Linear Mixed Model Results for Model 2

Secondary outcomes

While the overall Time effect did not differ between Model II and Model I for the Staff Domain Score, the assigned NIATx200 strategy comparing combined services to learning sessions only and staff job function were significantly associated with perceptions about staff involvement. The Time effect for the Organization Domain Score did not change from Model I to Model II, and only the assigned NIATx200 strategy was significantly associated with perceptions related to organizational capacity to support sustainability. For the Process Domain Score, the pattern of effect for time changed somewhat between Model II and Model I, and only staff job function was significant.

Repeated measures model III

Primary outcome

The addition of Total Participation levels and organizational characteristics in Model III demonstrated the importance of being in a free-standing agency and of higher participation, as measured by direct exposure hours, on overall beliefs about sustainability, but changed the statistical profiles for other model variables (see Table 4). For example, including Total Participation resulted in staff perceptions about sustainability being statistically significant only when comparing the two study endpoints. In addition, higher Total Sustainability Scores shifted to other implementation strategies (i.e., coaching). Finally, administrators continued to report a greater sustainability propensity than clinicians.

Table 4 Staff-Level Sustainability Perceptions: Linear Mixed Model Results for Model 3

Secondary outcomes

The findings for the Staff Domain Score were similar to the Model III results for overall sustainability capacity. Total Participation hours strongly influenced staff perceptions about the Process and Organization Domain Scores. Participation in the coaching implementation strategy, being an administrator, and working in a for-profit, free-standing facility were associated with greater sustainability propensity for the process domain. For the Organization Domain Score, the time effect increased throughout most of the study, and assignment to the coaching strategy was significant, as was being involved in a free-standing facility or an agency located in a rural setting.


Our study explored how the extent of participation in a QIC, based on implementation strategy assignment, was associated with staff perceptions about sustainability. The study was framed in the context of two conceptual sustainability frameworks (i.e., ISF and CFSC) with clear operational definitions using a rigorous outcome measure of sustainability capacity – the BNHS-SI. To the best of our knowledge, this study was the first to track how staff perceptions about sustainability changed longitudinally. We accomplished this by using data collected over a 27-month period from a convenience sample of responses from the same provider staff members participating in a QIC (NIATx200). In addition, the study is the first to measure the number of hours of provider participation in the assigned implementation strategy of the QIC and utilize the level of participation as a predictor of how staff perceptions about sustainability change over time.

Although NIATx200 offered minimal exposure to sustainability principles, staff perceptions about their organization’s overall SC continued to improve throughout the QIC implementation period, with the most significant improvement occurring over the first 9-month period. However, there was a clear indication of a saturation effect for perceptions of improved sustainability, with subsequent changes in overall sustainability scores eventually showing no noticeable differences as time progressed. Similar patterns occurred for the process, staff, and organizational domains. These findings support a need to continue reinforcing the importance of sustainability within organizations, especially later during the implementation process. An evaluation of long-term sustainment in the NIATx200 initiative found that between 27 and 40% of participating clinics sustained changes for one of the three study outcomes, but only 12% of the clinics sustained changes for two of the three study outcomes [45]. In some instances, improvement and subsequent sustainment occurred after the end of the active NIATx200 implementation period.

It was additionally noteworthy that there were noticeable differences across staff roles in perceptions about whether change will be sustained within organizations. Administrators and managers were much more likely to anticipate a propensity for enhanced sustainability than were clinicians within the organization. These differences were more prominent for attributes associated with the process and staffing domains of the BNHS-SI. As a result, the implementation process requires further effort to engage clinicians to convince them that effective change is beneficial and can be both attained and sustained. Such efforts could be supported through sustainability-specific modules (introduced early in the collaborative) followed by reinforcement of the sustainability concepts. In addition, the QIC could be structured to incorporate sustainability learning sessions or a coach-led sustainability site visit to reinforce sustainability concepts or help the organization develop a sustainability plan.

In addition to staff-level factors, organizational characteristics had some notable associations with sustainability levels. Interestingly, affiliation with a free-standing Alcohol and Drug Treatment Program had the most prevalent effect, being associated with higher total sustainability, as well as with the process and organization sustainability sub-domains. For-profit facilities demonstrated higher levels of process sustainability, while rural facilities were associated with greater organizational sustainability. Perhaps not surprisingly, none of the organizational factors were statistically related to the staff sub-domain of sustainability.

Study results suggest, as outlined in the Normalization Process Theory, that (1) staff involvement (working individually and collectively) to implement the change and (2) social processes (coherence, cognitive participation, collective action, and reflexive monitoring) are associated with how an innovation is embedded, integrated, and sustained within the organization [25, 50, 51]. Specifically, our results demonstrated that increased exposure promotes greater belief in sustainability, which was supported both through combinations of implementation strategies and extent of participation. When added to the multivariate statistical model, an organization’s cumulative participation level in a QIC became a principal predictor of staff SC perceptions, overriding the effects of the other factors. This finding is consistent with prior research showing that free-standing clinics participated more in NIATx200 [33]. Indeed, the pattern of effects found for the combined implementation strategies seems to be more a function of the hours of participation; when controlling for Total Participation hours, combined strategies are not as predictive of Total Sustainability as individual strategies.

Process sustainability focuses on staff beliefs in the benefits of and the credibility of the evidence for the change; how easy it is to adapt the change to the organization; and the presence of systems to monitor change. Higher levels of process sustainability in for-profit facilities may suggest that the infrastructure (e.g., training or culture) is better suited to support the constructs associated with cognitive participation (e.g., legitimation and buy-in) as well as reflexive monitoring (e.g., monitoring implementation impact) within the Normalization Process Theory [51]. Collective Action emphasizes the organizational resources needed to support change as well as the workability of the change in the organization [25, 50, 51]. In the BNHS-SI, the idea of workability may be associated with adaptability (i.e., ease of adapting the change to fit the organization) with organizational resources being associated with the infrastructure (i.e., policies and procedures and resources) to support the sustainability of change [23, 24]. Rural treatment facilities most likely do not have the resources to invest in changes that will not be sustained and therefore, take steps to establish an infrastructure needed to support and sustain change.

The results suggest that repeated exposure to sustainability concepts across different implementation strategies may help to ingrain within staff the importance of sustaining organizational change. Given these findings, it seems possible to develop and introduce specific sustainability content (e.g., how to develop and implement a sustainability plan) within the structure of a QIC as a means to assess the impact on staff SC perceptions over time and on the sustainment of organizational change. The effectiveness of a QIC with a sustainability component, compared to one without, could be evaluated within a randomized controlled trial that assigns organizations to the implementation strategy using baseline staff perceptions about organizational readiness or sustainability capacity.

The BNHS-SI assesses the six sustainability constructs from the CFSC that were most commonly found, appearing in at least 75% of both the sustainability approaches (retrospective versus prospective) and the levels of focus (organizational versus intervention) within the CFSC framework [21]. Table 5 outlines the conceptual relationships between CFSC and ISF constructs, the prominent attributes associated with these constructs, and how these attributes are measured within the BNHS-SI. For example, staff involvement and training emphasize the need to orient and train staff to deliver the initiative successfully in the CFSC, and align with two constructs in the ISF Process domain (i.e., partnership/engagement and training/support/supervision). However, the idea measured in the BNHS-SI is that staff have been involved from the beginning of the change and are adequately trained to sustain the improved process. Study results indicated that each hour of total participation in the NIATx QIC increased staff perceptions, as measured by the BNHS-SI, related to these six common sustainability constructs within the CFSC. Future research can attempt to directly replicate these findings by integrating the BNHS-SI as a measurement tool within the CFSC.

Table 5 Comparison of Conceptually Analogous Constructs from Consolidated Framework for Sustainability Constructs (CFSC), Integrated Sustainability Framework (ISF) and the British National Health Services Sustainability Index (BNHS-SI)

Further research should explore how these organizational characteristics, both independently and within the Normalization Process Theory, affect participation in a QIC and staff perceptions that changes will be sustained.

Evidence suggests that factors associated with sustainability include the fit of the innovation or change within the organizational culture, or a culture that encourages flexibility and adaptability [52, 53]. One item in the organizational sub-process of the BNHS-SI assesses the degree to which staff perceive the innovation as fitting within the organization’s strategic aims and culture. Our study found that staff perceptions in this sub-process changed over time and that participation was associated with the improvement. Furthermore, another study identified six guiding principles associated with the sustainability of a culture change [54]. Several enabling factors associated with these principles are assessed, in part, by the BNHS-SI, such as:

  • the perceived value of organizational data (Principle: Continuously assess and learn from cultural change and assessed within the Process sub-domain in the BNHS-SI);

  • willingness to relinquish control (Principle: Promote staff engagement and assessed within the Staff sub-domain in the BNHS-SI); and

  • perception that the change is legitimate and credible (Principle: Align vision and action and assessed within the Process sub-domain in the BNHS-SI).

The association of these factors with sustained cultural change would suggest that staff perceptions of organizational sustainability might be associated with the organizational culture that drives those perceptions. However, we did not assess either the changes in organizational culture or a relationship between culture and staff perceptions about sustainability. Future research should explore the role of organizational culture on staff perceptions about sustainability.

Strengths and limitations

Several strengths characterize this study. First, the study design involved data collection at four points in time, allowing prospective analyses. Second, the use of a unique identifier algorithm permitted a repeated measures statistical approach. The same cases could be tracked for changes across time, allowing changes in individuals’ perceptions about sustainability to be profiled. Finally, the structure of the implementation strategies within the NIATx200 initiative allowed for clinic-level participation to be tracked within the assigned implementation strategy.
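The linkage step described above can be illustrated with a minimal sketch. The identifiers, field names, and scores below are hypothetical, not the study’s actual algorithm or data: responses are keyed by a unique staff identifier, only cases observed at all four waves are retained as matched profiles, and per-wave means are computed over those profiles.

```python
from statistics import mean

# Hypothetical survey responses: (staff_id, wave, total_sc_score).
# Illustrative values only - not the NIATx200 data.
responses = [
    ("s1", 1, 60), ("s1", 2, 64), ("s1", 3, 66), ("s1", 4, 67),
    ("s2", 1, 55), ("s2", 2, 58), ("s2", 3, 61), ("s2", 4, 61),
    ("s3", 1, 70), ("s3", 2, 69),  # s3 has no responses after wave 2
]

WAVES = (1, 2, 3, 4)

def matched_profiles(rows, waves=WAVES):
    """Keep only staff with a linked response at every wave."""
    by_staff = {}
    for staff_id, wave, score in rows:
        by_staff.setdefault(staff_id, {})[wave] = score
    return {sid: scores for sid, scores in by_staff.items()
            if all(w in scores for w in waves)}

profiles = matched_profiles(responses)
wave_means = {w: mean(p[w] for p in profiles.values()) for w in WAVES}

print(sorted(profiles))              # ['s1', 's2'] - s3 is excluded
print(wave_means[1], wave_means[4])  # 57.5 64.0
```

Only fully linked cases contribute to the time profile, which mirrors the exclusion of unmatched surveys noted in the limitations below; the actual study fit a mixed linear model to the matched cases rather than comparing raw wave means.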

Although this study had strengths, certain limitations also need to be considered. First, as mentioned previously, the data collected do not directly represent staff-level participation in QIC activities. Participation was calculated for each clinic, but still ultimately depended on staff involvement. Second, although repeated-measures analysis allowed for profiles of unique individual cases across time, case variation between the time phases might have existed. However, the repeated covariance model likely lessened the impact of such variation. Third, while the study comprised a 27-month intervention timeframe, consisting of baseline measurement and three 9-month intervals, there was some variation in the 9-month periods when staff surveys were completed (Fig. 1). Fourth, the sample included only responses from linked staff surveys over the four data collection time periods. Although we had staff responses from 83.6% of the participating clinics, the exclusion of certain clinics precluded controlling for successful improvement in the outcomes (wait time, retention, or admission) or exploring how success could influence staff attitudes towards sustainability. Finally, the data included only limited, employment-related staff demographic information. Further application of the NIATx framework, which utilizes a broader variety of staff characteristics, could provide additional insights into their influences on sustainability propensity.


Sustainability of organizational change represents an increasingly important focus of implementation research. Scientific evidence about how staff perceive organizational sustainability capacity, and about what influences changes in those perceptions over time, remains a gap in dissemination and implementation research. These findings address that recognized gap. Although staff perceptions about sustainability capacity changed over time, this analysis determined that staff participation, representing the level of involvement in the assigned implementation strategy, was the most significant contributor to changes in staff perceptions about sustainability propensity over time. The impact of participation “dose exposure” on sustainability perceptions highlights the need for dissemination and implementation strategies to reinforce concepts associated with sustainability and thereby improve staff perceptions about the sustainability capacity of the organization.

Since a QIC is often utilized in implementation efforts, recognizing that staff participation influences staff perceptions about sustainability can inform its design and structure and provides a foundational step toward determining how change is sustained in substance abuse clinics. The unique perspectives from this study address recognized gaps in the literature, including when and how to incorporate sustainability concepts in a QIC to increase the likelihood that changes will be sustained. Knowing that participation in a QIC influences staff perceptions, researchers and practitioners can improve the structure of a QIC by developing activities or interventions (e.g., developing and introducing specific sustainability content) or by modifying implementation strategies (e.g., coaching for sustainability). Such efforts might promote greater staff involvement or participation in the QIC. Future research could assess the impact of these changes on how staff perceptions of sustainability capacity change over time and on whether a change is successfully sustained in the organization.

Availability of data and materials

Datasets generated and/or analyzed during the current study are not publicly available because the data contains potentially identifying information but are available from the corresponding author on reasonable request.



Abbreviations

BNHS-SI: British National Health Services Sustainability Index

CFSC: Consolidated Framework for Sustainability Constructs

IRB: Institutional Review Board

ISF: Integrated Sustainability Framework

NIDA: National Institute on Drug Abuse

NIATx: Network for the Improvement of Addiction Treatment

PSAT: Program Sustainability Assessment Tool

QIC: Quality Improvement Collaborative

SC: Sustainability Capacity


  1. Schell S, Luke D, Schooley M, Elliott M, Herbers S, Mueller N, Bunger A. Public health program capacity for sustainability: a new framework. Implement Sci. 2013;8:15.


  2. SAMHSA: Substance Abuse Mental Health Services Administration, Center for Behavioral Health Statistics and Quality. Behavioral Health Services Information Series: National Directory of Drug and Alcohol Abuse Treatment Facilities 2015, NSDUH series H-48, HHS publication no.(SMA) 16-4940. In Substance Abuse and Mental Health Services Administration, Rockville, MD; 2015.

  3. Substance Abuse and Mental Health Services Administration: Key substance use and mental health indicators in the United States: Results from the 2018 National Survey on Drug Use and Health. (Center for Behavioral Health Statistics and Quality SAaMHSA ed. Rockville, MD; 2019.

  4. Ford JH II, Green CA, Hoffman KA, Wisdom JP, Riley KJ, Bergmann L, Molfenter T. Process improvement needs in substance abuse treatment: admissions walk-through results. J Subst Abus Treat. 2007;33:379–89.


  5. Stumbo SP, Ford JH, Green CA. Factors influencing the long-term sustainment of quality improvements made in addiction treatment facilities: a qualitative study. Addict Sci Clin Pract. 2017;12:26.


  6. Alanis-Hirsch K, Croff R, Ford JH 2nd, Johnson K, Chalk M, Schmidt L, McCarty D. Extended-release naltrexone: a qualitative analysis of barriers to routine use. J Subst Abus Treat. 2016;62:68–73.


  7. Croff R, Hoffman K, Alanis-Hirsch K, Ford J, McCarty D, Schmidt L. Overcoming barriers to adopting and implementing pharmacotherapy: the medication research partnership. J Behav Health Serv Res. 2019;46:330–9.


  8. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Admin Pol Ment Health. 2011;38:4–23.


  9. Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101:2059.


  10. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117.


  11. Ford JH 2nd, Alagoz E, Dinauer S, Johnson KA, Pe-Romashko K, Gustafson DH. Successful organizational strategies to sustain use of A-CHESS: a mobile intervention for individuals with alcohol use disorders. J Med Internet Res. 2015;17.

  12. Fleiszer AR, Semenic SE, Ritchie JA, Richer MC, Denis JL. An organizational perspective on the long-term sustainability of a nursing best practice guidelines program: a case study. BMC Health Serv Res. 2015;15:535.


  13. Fleiszer AR, Semenic SE, Ritchie JA, Richer MC, Denis JL. The sustainability of healthcare innovations: a concept analysis. J Adv Nurs. 2015;71:1484–98.


  14. Fleiszer AR, Semenic SE, Ritchie JA, Richer MC, Denis JL. A unit-level perspective on the long-term sustainability of a nursing best practice guidelines program: an embedded multiple case study. Int J Nurs Stud. 2016;53:204–18.


  15. Fleiszer AR, Semenic SE, Ritchie JA, Richer MC, Denis JL. Nursing unit leaders' influence on the long-term sustainability of evidence-based practice improvements. J Nurs Manag. 2016;24:309–18.


  16. Bond GR, Drake RE, Becker DR, Noel VA. The IPS learning community: a longitudinal study of sustainment, quality, and outcome. Psychiatr Serv. 2016;67:864–9.


  17. Peterson AE, Bond GR, Drake RE, McHugo GJ, Jones AM, Williams JR. Predicting the long-term sustainability of evidence-based practices in mental health care: an 8-year longitudinal analysis. J Behav Health Serv Res. 2014;41:337–46.


  18. Aarons GA, Green AE, Trott E, Willging CE, Torres EM, Ehrhart MG, Roesch SC. The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: a mixed-method study. Adm Policy Ment Health Ment Health Serv Res. 2016;43:991–1008.


  19. Hunter SB, Han B, Slaughter ME, Godley SH, Garner BR. Predicting evidence-based treatment sustainment: results from a longitudinal study of the adolescent-community reinforcement approach. Implement Sci. 2017;12:75.


  20. Shelton RC, Cooper BR, Stirman SW. The Sustainability of Evidence-Based Interventions and Practices in Public Health and Health Care. Annu Rev Public Health. 2018.

  21. Lennox L, Maher L, Reed J. Navigating the sustainability landscape: a systematic review of sustainability approaches in healthcare. Implement Sci. 2018;13:27.


  22. Luke DA, Calhoun A, Robichaux CB, Elliott MB, Moreland-Russell S. The program sustainability assessment tool: a new instrument for public health programs. Prev Chronic Dis. 2014;11:130184.


  23. Sustainability Model and Guide [Retrieved from on December 12, 2018].

  24. Doyle C, Howe C, Woodcock T, Myron R, Phekoo K, McNicholas C, Saffer J, Bell D. Making change last: applying the NHS institute for innovation and improvement sustainability model to healthcare improvement. Implement Sci. 2013;8:127.


  25. May CR, Mair F, Finch T, MacFarlane A, Dowrick C, Treweek S, Rapley T, Ballini L, Ong BN, Rogers A. Development of a theory of implementation and integration: normalization process theory. Implement Sci. 2009;4:29.


  26. Chambers D, et al. Proceedings from the 11th annual conference on the science of dissemination and implementation: Washington, DC, USA. 3-5 December 2018. Implement Sci. 2019;14:27.


  27. Hoffman KA, Ford JH 2nd, Choi D, Gustafson DH, McCarty D. Replication and sustainability of improved access and retention within the network for the improvement of addiction treatment. Drug Alcohol Depend. 2008;98:63–9.


  28. Hoffman KA, Ford JH, Tillotson CJ, Choi D, McCarty D. Days to treatment and early retention among patients in treatment for alcohol and drug disorders. Addict Behav. 2011;36:643–7.


  29. McCarty D, Gustafson DH, Wisdom JP, Ford J, Choi D, Molfenter T, Capoccia V, Cotter F. The network for the improvement of addiction treatment (NIATx): enhancing access and retention. Drug Alcohol Depend. 2007;88:138–45.


  30. Roosa M, Scripa JS, Zastowny TR, Ford JH 2nd. Using a NIATx based local learning collaborative for performance improvement. Eval Program Plann. 2011;34:390–8.


  31. Quanbeck AR, Gustafson DH, Ford JH 2nd, Pulvermacher A, French MT, McConnell KJ, McCarty D. Disseminating quality improvement: study protocol for a large cluster-randomized trial. Implement Sci. 2011;6:44.


  32. McConnell KJ, Hoffman KA, Quanbeck A, McCarty D. Management practices in substance abuse treatment programs. J Subst Abus Treat. 2009;37:79–89.


  33. Grazier KL, Quanbeck AR, Oruongo J, Robinson J, Ford JH 2nd, McCarty D, Pulvermacher A, Johnson RA, Gustafson DH. What influences participation in QI? A randomized trial of addiction treatment organizations. J Healthc Qual. 2015;37:342–53.


  34. Gustafson DH, Quanbeck AR, Robinson JM, Ford JH 2nd, Pulvermacher A, French MT, McConnell KJ, Batalden PB, Hoffman KA, McCarty D. Which elements of improvement collaboratives are most effective? A cluster-randomized trial. Addiction. 2013;108:1145–57.


  35. Mahomed OH, Asmall S, Voce A. Sustainability of the integrated chronic disease management model at primary care clinics in South Africa. Afr J Primary Health Care Family Med. 2016;8.

  36. Van Heerden C, Maree C, Janse van Rensburg ES. Strategies to sustain a quality improvement initiative in neonatal resuscitation. Afr J Primary Health Care Family Med. 2016;8:1–10.


  37. Sayer NA, Rosen CS, Bernardy NC, Cook JM, Orazem RJ, Chard KM, Mohr DC, Kehle-Forbes SM, Eftekhari A, Crowley J. Context matters: team and organizational factors associated with reach of evidence-based psychotherapies for PTSD in the veterans health administration. Adm Policy Ment Health Ment Health Serv Res. 2017;44:904–18.


  38. Ford JH 2nd, Krahn D, Oliver KA, Kirchner J. Sustainability in primary care and mental health integration projects in veterans health Administration. Qual Manag Health Care. 2012;21:240–51.


  39. Ford JH 2nd, Krahn D, Wise M, Oliver KA. Measuring sustainability within the veterans Administration mental health system redesign initiative. Qual Manag Health Care. 2011;20:263–79.


  40. Ford JH 2nd, Wise M, Krahn D, Oliver KA, Hall C, Sayer N. Family care map: sustaining family-centered care in Polytrauma rehabilitation centers. J Rehabil Res Dev. 2014;51:1311–24.


  41. Johnson JE, Wiltsey-Stirman S, Sikorskii A, Miller T, King A, Blume JL, Pham X, Simas TAM, Poleshuck E, Weinberg R. Protocol for the ROSE sustainment (ROSES) study, a sequential multiple assignment randomized trial to determine the minimum necessary intervention to maintain a postpartum depression prevention program in prenatal clinics serving low-income women. Implement Sci. 2018;13:115.


  42. Marini AL, Khan R, Mundekkadan S. Multifaceted bundle interventions shown effective in reducing VAP rates in our multidisciplinary ICUs. BMJ Open Quality 2016. 5:u205566–w202278.

  43. Kastner M, Sayal R, Oliver D, Straus SE, Dolovich L. Sustainability and scalability of a volunteer-based primary care intervention (health TAPESTRY): a mixed-methods analysis. BMC Health Serv Res. 2017;17:514.


  44. Ploeg J, Ireland S, Cziraki K, Northwood M, Zecevic AA, Davies B, Murray MA, Higuchi K. A sustainability oriented and mentored approach to implementing a fall prevention guideline in acute care over 2 years. SAGE Open Nursing. 2018;4:2377960818775433.


  45. Ford JH, Stumbo SP, Robinson JM. Assessing long-term sustainment of clinic participation in NIATx200: results and a new methodological approach. J Subst Abus Treat. 2018;92:51–63.


  46. Higuchi KS, Downey A, Davies B, Bajnok I, Waggott M. Using the NHS sustainability framework to understand the activities and resource implications of Canadian nursing guideline early adopters. J Clin Nurs. 2013;22:1707–16.


  47. Knapp H, Hagedorn H, Anaya HD. HIV rapid testing in a veterans affairs hospital ED setting: a 5-year sustainability evaluation. Am J Emerg Med. 2014;32:878–83.


  48. Higuchi KS, Davies B, Ploeg J. Sustaining guideline implementation: a multisite perspective on activities, challenges and supports. J Clin Nurs. 2017;26:4413–24.


  49. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, Rycroft-Malone J, Meissner P, Murray E, Patel A. Standards for reporting implementation studies (StaRI) statement. Bmj. 2017;356:i6795.


  50. May C, Finch T. Implementing, embedding, and integrating practices: an outline of normalization process theory. Sociology. 2009;43:535–54.


  51. McEvoy R, Ballini L, Maltoni S, O’Donnell CA, Mair FS, MacFarlane A. A qualitative systematic review of studies using the normalization process theory to research implementation processes. Implement Sci. 2014;9:2.


  52. Palinkas LA, Spear SE, Mendon SJ, Villamar J, Reynolds C, Green CD, Olson C, Adade A, Brown CH. Conceptualizing and measuring sustainability of prevention programs, policies, and practices. Transl Behav Med. 2020;10:136–45.


  53. Xiang X, Robinson-Lane SG, Rosenberg W, Alvarez R. Implementing and sustaining evidence-based practice in health care: the bridge model experience. J Gerontol Soc Work. 2018;61:280–94.


  54. Willis CD, Saul J, Bevan H, Scheirer MA, Best A, Greenhalgh T, Mannion R, Cornelissen E, Howland D, Jenkins E, Bitz J. Sustaining organizational culture change in health systems. J Health Organ Manag. 2016;30:2–30.




The authors would like to thank Ka Xiong for her review of the manuscript. We would also like to acknowledge the thousands of very dedicated, very busy staff from the substance abuse clinics in Massachusetts, Michigan, New York, Oregon, and Washington who participated in the NIATx200 study, as well as members of the original NIATx200 research team. An earlier version of this research was presented at the 11th Annual Conference on the Science of Dissemination and Implementation.


This study was funded by NIDA (R01 DA020832, R21 DA36700). NIDA was not involved in data collection, data analysis or writing of this paper. The statements made here are those of the authors.

Author information




All authors made significant contributions to this manuscript. JHF planned and implemented the study design, was involved in the data collection, and led the development and writing of the manuscript. AMG conducted the data analysis and was a major contributor in writing the manuscript. All authors read and approved the final manuscript. Portions of the research were presented at the 11th Annual Dissemination and Implementation conference.

Corresponding author

Correspondence to James H. Ford II.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed and approved by the University of Wisconsin Social and Behavioral Sciences Institutional Review Board (SE-2006-0521) and the Health Sciences minimal risk IRB (2014–1048).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Sustainability Construct by Assessment Timing and Level of Focus.

Additional file 2.

Blank British National Health Service Survey.

Additional file 3.

British National Health Services Sustainability Index Scoring Guide.

Additional file 4.

StaRI Checklist.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Ford, J.H., Gilson, A. Influence of participation in a quality improvement collaborative on staff perceptions of organizational sustainability. BMC Health Serv Res 21, 34 (2021).
