Specifying sickle cell disease interventions: a study protocol of the Sickle Cell Disease Implementation Consortium (SCDIC)

Abstract

Background

Sickle cell disease (SCD) is an inherited blood disorder that results in a lifetime of anemia, severe pain, and end-organ damage that can lead to premature mortality. While the SCD field has made major medical advances, much remains to be done to improve the quality of care for people with SCD. This study capitalizes on the Sickle Cell Disease Implementation Consortium (SCDIC), a consortium of eight academic sites aiming to test implementation strategies that could lead to more accelerated application of the National Heart, Lung, and Blood Institute (NHLBI) guidelines for treating SCD. This report documents the process to support the consortium by specifying the interventions being developed.

Methods

This study consists of three steps. The Principal Investigator of each site and two site representatives who are knowledgeable about the intervention (e.g., the study coordinator or the person delivering the intervention) will answer an online survey aiming to capture components of the interventions. The site representatives will complete this survey three times during the study: during the development of the interventions, after the interventions have been implemented for one year, and at the end of the study (after 2 years). A site visit and semi-structured interview (Step 2) in the first year of the process will capture the context of the sites. Step 3 comprises the development of a framework with the details of the multi-component SCDIC interventions at the sites.

Discussion

The outcome of this study, a framework of the SCDIC, will enable accurate replication and extension of published research, facilitating the translation of SCD studies to diverse populations and settings and allowing for theory testing of the effects of the intervention components across studies in different contexts and for different populations.

Trial registration

ClinicalTrials.gov (#NCT03380351). Registered December 21, 2017.

Background

Sickle cell disease (SCD) is an inherited blood disorder that results in a lifetime of anemia, severe pain, and end-organ damage that can lead to premature mortality [1]. SCD predominantly affects African Americans and other underrepresented minorities [2] and, while the field has made major medical advances [3], the average survival of African American individuals with SCD remains 30–45 years less than the average life expectancy for African Americans [4]. There are, however, efficacious interventions for SCD that can improve quality of life and life expectancy by reducing the risks of stroke, pulmonary complications, and the need for transfusions, and by reducing the number of pain episodes [5, 6]. Fortunately, the diagnosis of the disease is relatively simple, and mandatory newborn screening for SCD exists in all states, so such interventions can be initiated early in the lifespan [7]. In 2014, the National Heart, Lung and Blood Institute released a set of evidence-based recommendations for treating SCD [8], but it is clear that the therapies proven to be efficacious are not reaching those in need [8, 9]. Implementation science, a field that aims to close the “quality gap” and support the transition of evidence-based interventions to usual care [10], can help improve the field of SCD care delivery [11].

Distinguishing between clinical studies and implementation studies may help clarify how implementation science can help improve the quality of care for people with SCD. Clinical studies aim to evaluate the efficacy or effectiveness of a specific intervention. Implementation studies, on the other hand, focus on how to successfully implement the intervention in specific contexts [12]. An implementation study aims to accelerate the adoption of an intervention or guideline by providers and/or systems of care, with the outcomes of interest usually being at the provider and/or systems level [13]. In implementation studies, attention is given to the implementation strategies (e.g., audit and feedback, changes in the electronic record system, clinical supervision), defined as the processes by which evidence-based health innovations are adopted and integrated into usual care [14]. These studies aim to change behaviors or settings at the organizational, practitioner, or patient level [15] with the goal of enhancing the adoption of an intervention or guideline. Implementation studies, therefore, require a paradigm shift [16], as they extend efficacy and effectiveness research, which focuses on discovering what works, to understanding how implementation works in specific contexts [17].

The context of SCD care is complex and permeated with disparities. Treatment for SCD relies heavily on public insurance and healthcare programs [2], there are few specialized treatment centers [7], and the majority of people with SCD are minority and socioeconomically disadvantaged people who are also less likely than their counterparts to be linked to quality systems of care [18]. Children with SCD are at risk for life-threatening infections, strokes, acute chest syndrome, leg ulcers, and pulmonary hypertension, among other complications [19]. Managing the disease involves multiple settings, such as the emergency department, inpatient and outpatient care, as well as home and, for children, school [20]. Proper management also includes both primary care and specialty care, though the latter may not always be available. Managing SCD is particularly complicated when adolescents are transitioning into adulthood, as they may have to move from a children’s clinic to adult clinics, adult emergency rooms, and hospitals. This may also be a time when individuals are taking on more responsibility for managing their own disease complications, entering the workforce, and taking on other adult responsibilities [21].

The Sickle Cell Disease Implementation Consortium (SCDIC)

To address the challenges in managing SCD at this critical age juncture, the National Heart, Lung and Blood Institute (NHLBI), with co-sponsorship from the National Institute on Minority Health and Health Disparities, established a national research consortium to identify and test implementation strategies that could lead to more accelerated, consistent, and widespread application of the NHLBI guidelines. The Sickle Cell Disease Implementation Consortium (SCDIC): Using Implementation Science to Optimize Care of Adolescents and Adults with Sickle Cell Disease was established in 2016 [22] as a collaborative, multi-sector research program that supports eight academic sites facilitated by a Data Coordinating Center (the Research Triangle Institute) [11, 23]. The Consortium is charged with improving “the health and well-being of adolescents and adults with SCD in the US through the development of multi-modal, multi-sector interventions aimed at improving the rate at which patients with SCD receive routine primary care” [22].

Originally, each of the eight sites proposed a different intervention to achieve a common goal of improving the quality of care of people with SCD. To assist in cross-site collaboration, the SCDIC Executive Committee grouped site representatives from each of the eight academic sites to develop interventions in three main areas: (a) care redesign, (b) emergency department care, and (c) reducing the number of patients unaffiliated with care. The specific research questions, designs, and components of the interventions are currently being developed.

Aims

The process described here was developed to support the sites in the SCDIC by defining the components of the interventions, with the goal of increasing the transparency and scientific reproducibility of the complex and multilevel studies being developed by the consortium sites.

The importance of scientific reproducibility

One of the cornerstones of science is the ability to reproduce research findings [24]. In recent years, there has been growing awareness of the need for transparency in studies to increase scientific reproducibility, including the launch of ClinicalTrials.gov in 2000, a registry where researchers record their methods and outcome measures, and the development of guidelines such as CONSORT [25], TIDieR [26], and StaRI [27] to support transparency in research. Ideally these initiatives would facilitate replicability of major findings; however, a number of studies have failed to replicate prominent findings or failed to achieve expected outcomes [28,29,30,31]. Insights from researchers reveal that, among other issues, the absence of clear descriptions of studies, including hypotheses for outcomes and descriptions of their intervention strategies [31, 32], makes these studies virtually unusable, as they cannot be replicated, leading to inefficient use of research effort and millions of dollars [33]. Unfortunately, the lack of detailed description of interventions, including theoretical justification and review of the literature, remains prominent despite the push for publication guidelines [33, 34].

Describing components of complex interventions in the healthcare system, however, is challenging, as the relations among the intervention, the implementation strategies, and the contextual factors are often dynamic, making the precise distinction among these components less clear [35]. A priori definition of the intervention components, the context, and the strategies used to implement the interventions can hopefully address many of the current problems in the intervention and implementation science fields: inconsistent labelling, disciplinary differences, vague descriptions (lack of transparency) with consequently difficult replication, and a resulting inability to execute more precise meta-analyses [27, 35,36,37,38,39,40]. In other words, a standardized and detailed description of interventions allows accurate replication and extension of published research, facilitating the translation of studies to other populations and settings and allowing for theory testing of the effects of the intervention components across studies in different contexts and for different populations.

The description of a study needs to go beyond a correct and detailed description of methods and outcomes. A challenge to implementation research in any field is to capture what was proposed and what actually happened. There are at least two consequences of such a mismatch and lack of transparency. First, by not adequately describing the study objective, methods, populations, interventions, outcomes measured, and context, authors fail to report their protocols proactively and with high quality [33]. By proactively describing their studies in publications such as study protocols, researchers can contextualize their studies, allowing others to place them “into the context of totality of evidence” [41] and to reduce research waste by avoiding similar studies when evidence has already been established. Second, it is well known that proposed studies differ from what actually happens when interventions are implemented [42, 43]. By clearly defining the details of the interventions upfront, and proactively defining what can be adapted and when and how adaptations can be made, it should be easier to capture adaptations to the interventions once the study is ongoing [42]. In summary: “without accessible and usable reports, research cannot help patients and their clinicians” [44].

To support the SCDIC, the process described here aims to facilitate the characterization of the interventions and strategies being developed by the SCDIC in an effort to compare interventions. The primary outcome of this process will be a detailed framework of the SCDIC interventions that will support rigor and transparency, enable the Consortium to evaluate which intervention components were most effective in which settings, and offer guidance to future efforts to improve treatment delivery and improve quality of life and life expectancy for adults living with SCD. This approach will also serve as a model for how other consortia can describe, delineate, and evaluate interventions and implementation strategies across diverse sites.

Methods

Guided by the recent report by the National Academy of Engineering (NAE) and the Institute of Medicine (IOM) calling for applying systems engineering methods to healthcare systems [45], we will focus on different levels that are interdependent and necessary to improve the quality of care. Accordingly, in recent years, approaches from the fields of engineering and business have been used to identify components of interventions [36, 46, 47] as predictors of outcomes [46, 48, 49]. This study is approved by the Washington University in St. Louis Human Research Protection Office (IRB Protocol # 201709005) and registered at ClinicalTrials.gov (#NCT03380351).

Participants

Each site will have three representatives who will participate in this process: the PI of the study at that site and two people nominated by the PI as being the most knowledgeable about the intervention (e.g., study coordinator, interventionist/clinician). The interviews will be conducted during a half-day site visit. All data will be collected from the site representatives, and there will be no contact with the patient participants of the SCDIC interventions. Because this study entails describing the interventions and how they are delivered, the IRB has approved an information sheet to be provided to the site representatives in place of written informed consent.

Data collection

The data collection for this study will have three steps: Step 1 will be a survey conducted prior to intervention implementation to capture the details of the proposed intervention. After one year, the survey will be conducted again, in addition to site visits (Step 2), to capture any differences between what was proposed and what is actually being implemented. Finally, Step 3, which will take place in the second year of this study, involves the sites’ representatives answering the survey again and developing a framework with a detailed description of the SCDIC interventions.

Step 1: Developing the matrix

An online survey, adapted from a survey used in other consortia [50], will be developed to characterize the interventions. The survey, with multiple response options, will capture nine components of the interventions: (1) mode of delivery (e.g., face to face, video, phone), (2) materials used in the delivery of the intervention (e.g., manuals, pamphlets, videotapes), (3) where the intervention is being delivered (e.g., emergency department, primary care clinic, schools), (4) duration and intensity of the intervention (e.g., number of meetings, distribution of meetings over time), (5) scripting, or how researchers or implementers plan to interact with providers and participants of the study (e.g., language provided to support interaction, general guidelines), (6) sensitivity to participant characteristics (e.g., visual supplements, oral supplements, literacy level), (7) interventionist characteristics (e.g., who is delivering the intervention, demographics of the interventionist, how much training is required to deliver the intervention), (8) adaptability, or the extent to which an intervention can be modified (e.g., what components can be modified, on what basis, and when), and (9) outcomes and hypotheses (i.e., planned outcomes being assessed, theoretical rationale for the outcomes, hypotheses being tested, and when and by whom the assessments will be conducted).
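For illustration only, the nine survey components above could be organized as one structured record per site, which is one way a consortium-wide matrix can be assembled; this is a minimal sketch, and every field name below is a hypothetical placeholder rather than an item from the actual SCDIC survey.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class InterventionRecord:
    """Hypothetical record mirroring the nine intervention components above."""
    site: str
    mode_of_delivery: List[str]          # (1) e.g., ["face to face", "phone"]
    materials: List[str]                 # (2) e.g., ["manuals", "pamphlets"]
    delivery_setting: List[str]          # (3) e.g., ["emergency department"]
    duration_and_intensity: str          # (4) e.g., "6 sessions over 12 weeks"
    scripting: str                       # (5) planned interaction with providers/participants
    participant_sensitivity: List[str]   # (6) e.g., ["visual supplements", "low literacy"]
    interventionist: str                 # (7) who delivers it and the training required
    adaptability: str                    # (8) what can be modified, when, and on what basis
    outcomes_and_hypotheses: List[str]   # (9) planned outcomes, rationale, hypotheses
```

A record of this kind could be completed at each of the three survey waves, which is what would later allow the proposed and delivered versions of an intervention to be compared.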

To capture the implementation strategies, the survey will also incorporate recommendations on specifying and reporting implementation strategies [36] by asking the research teams to identify (a) who is delivering the intervention, (b) what are the actions, (c) who is the action target, (d) the temporality (i.e., when the strategy is being used), (e) the dose (i.e., how often the strategy is being used), (f) the implementation outcome likely affected by the strategy, and (g) the theoretical justification for using the strategy.
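Extending the same hypothetical sketch, the seven strategy-specification elements (a) through (g) drawn from the reporting recommendations [36] could be captured alongside each intervention record; again, these names are illustrative only, not the consortium's instrument.

```python
from dataclasses import dataclass

@dataclass
class StrategySpecification:
    """Hypothetical record mirroring elements (a) through (g) above."""
    actor: str                   # (a) who delivers the intervention
    actions: str                 # (b) what the actions are
    action_target: str           # (c) who or what the action targets
    temporality: str             # (d) when the strategy is used
    dose: str                    # (e) how often the strategy is used
    implementation_outcome: str  # (f) implementation outcome likely affected
    justification: str           # (g) theoretical justification for the strategy
```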

Step 2: Site visits and follow up calls

Because we know that the proposed interventions will probably be adapted during the implementation process [42], we will ask the site representatives to answer the survey again one year after the interventions have begun. Additionally, the study team will make a half-day site visit to each site to learn more about the context of each study. If needed, follow-up calls will be made to support the completion of the matrix.

Step 3: Refining and agreeing on the framework

At the end of the study, representatives of the sites will answer the survey again. The framework, which will be based on the three data points and will compare what was proposed with what was actually delivered, will be presented to all sites in a webinar meeting, during which feedback will be gathered regarding the accuracy of the data.
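Assuming survey responses are stored as simple field-value mappings like the sketches above, a field-by-field comparison is one way to surface differences between what was proposed (wave 1) and what was delivered (wave 3); this is an illustrative sketch under that assumption, not the consortium's analysis code.

```python
def compare_waves(proposed: dict, delivered: dict) -> dict:
    """Return the fields whose values changed between two survey waves."""
    changed = {}
    for name in proposed.keys() | delivered.keys():
        if proposed.get(name) != delivered.get(name):
            changed[name] = {"proposed": proposed.get(name),
                             "delivered": delivered.get(name)}
    return changed

# Hypothetical example: one site switched from in-person to phone delivery.
wave_1 = {"mode_of_delivery": ["face to face"], "duration_and_intensity": "6 sessions"}
wave_3 = {"mode_of_delivery": ["phone"], "duration_and_intensity": "6 sessions"}
print(compare_waves(wave_1, wave_3))
# {'mode_of_delivery': {'proposed': ['face to face'], 'delivered': ['phone']}}
```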

Data analysis

Simple descriptive, comparative, and correlational statistics will be used to analyze data captured via the survey. The interviews will be transcribed, and data analysis will be based upon principles from grounded theory using emergent coding strategies [51,52,53,54,55]. Coding will begin with open coding of the transcripts to develop topics and codes, and a codebook will then be developed listing codes, code definitions, and sample coded data. The codebook will be tested and iteratively revised until coder agreement is high and no modifications to the codebook are deemed necessary. At least two raters (the PI and a research assistant) will code each transcript using the codebook. Code queries will be used to examine all data under each code, which will be reviewed and analyzed to determine common meanings and themes. We will evaluate thematic saturation (i.e., whether further analysis yields no new themes) and follow the Consolidated Criteria for Reporting Qualitative Research (COREQ) guidelines when reporting our findings [56].
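The protocol does not name a specific agreement statistic, but as a hedged illustration, coder agreement during codebook refinement could be checked with Cohen's kappa (here via scikit-learn), assuming transcripts are segmented and each segment receives one code per rater.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical code assignments by two raters for the same ten transcript segments.
rater_1 = ["context", "adaptation", "context", "barrier", "barrier",
           "context", "adaptation", "barrier", "context", "adaptation"]
rater_2 = ["context", "adaptation", "barrier", "barrier", "barrier",
           "context", "adaptation", "barrier", "context", "context"]

kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa between raters: {kappa:.2f}")
# Low agreement would prompt another round of codebook revision.
```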

Discussion

This study is timely and important. First, it is a unique opportunity to support a consortium of eight academic sites aiming to improve the quality of life of adolescents and adults with SCD. By doing this prospectively rather than retrospectively, as has been done with other consortia [47, 50, 57, 58], one can compile the data and report intervention characteristics as they are being developed and implemented. This proactive approach should facilitate identifying and addressing differences in labelling by the different stakeholders in the consortium (e.g., the word “implementation” is often used with different meanings by hematologists and implementation scientists) and increase the transparency of descriptions of the interventions. Additionally, by asking site representatives to describe the study objective, methods, populations, interventions, and outcomes in “the context of totality of evidence” [41], we hope to reduce research waste and push the field of SCD further.

By clearly detailing the interventions and planning for potential adaptations prospectively, and surveying site representatives at the end of the studies, the study described here will contribute to the fields of SCD and implementation science as we will be able to capture the differences in what was proposed and what was actually delivered. It is our hope that this process will help us delineate methodologies to capture the adaptations made to interventions being implemented in different settings (e.g., hospitals, schools) as part of this rich consortium.

In summary, the framework of the SCDIC will enable accurate replication and extension of published research, facilitating the translation of studies to diverse populations and settings and allowing for theory testing of the effects of the intervention components across studies in different contexts and for different populations [46, 57, 58].

Abbreviations

SCD:

Sickle cell disease

SCDIC:

Sickle Cell Disease Implementation Consortium

References

  1. Kanter J, Kruse-Jarres R. Management of sickle cell disease from childhood through adulthood. Blood Rev. 2013;27(6):279–87. https://doi.org/10.1016/j.blre.2013.09.001.

  2. Hassell KL. Population estimates of sickle cell disease in the U.S. Am J Prev Med. 2010;38(4):S512–21. https://doi.org/10.1016/j.amepre.2009.12.022.

  3. Adams RJ, McKie VC, Hsu L, et al. Prevention of a first stroke by transfusions in children with sickle cell anemia and abnormal results on transcranial doppler ultrasonography. N Engl J Med. 1998;339(1):5–11. https://doi.org/10.1056/NEJM199807023390102.

  4. The Henry J Kaiser Family Foundation. State health facts - life expectancy at birth (in years), by race/ethnicity 2015. http://kff.org/other/state-indicator/life-expectancy-by-re/. Accessed 19 Mar 2018.

  5. Charache S, Terrin ML, Moore RD, et al. Effect of hydroxyurea on the frequency of painful crises in sickle cell anemia. N Engl J Med. 1995;332(20):1317–22. https://doi.org/10.1056/NEJM199505183322001.

  6. Wang WC, Ware RE, Miller ST, et al. Hydroxycarbamide in very young children with sickle-cell anaemia: a multicentre, randomised, controlled trial (BABY HUG). Lancet. 2011;377(9778):1663–72. https://doi.org/10.1016/S0140-6736(11)60355-3.

  7. Yusuf HR, Lloyd-Puryear M, Grant AM, Parker CS, Creary MS, Atrash HK. Sickle cell disease. Am J Prev Med. 2011;41(6):S376–83. https://doi.org/10.1016/j.amepre.2011.09.007.

  8. National Heart, Lung, and Blood Institute. Evidence-based management of sickle cell disease: Expert panel report. 2014. https://www.nhlbi.nih.gov/health-topics/evidence-based-management-sickle-cell-disease. Accessed 19 Mar 2018.

  9. Hassell KL. Sickle cell disease. Am J Prev Med. 2016;51(1):S1–2. https://doi.org/10.1016/j.amepre.2015.11.002.

  10. Eccles MP, Mittman B. Welcome to implementation science. Implement Sci. 2006;1(1). https://doi.org/10.1186/1748-5908-1-1.

  11. King AA, Baumann AA. Sickle cell disease and implementation science: a partnership to accelerate advances. Pediatr Blood Cancer. 2017:e26649. https://doi.org/10.1002/pbc.26649.

  12. Flay BR, Biglan A, Boruch RF, et al. Standards of evidence: criteria for efficacy, effectiveness and dissemination. Prev Sci. 2005;6(3):151–75. https://doi.org/10.1007/s11121-005-5553-y.

  13. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50:217–26. https://doi.org/10.1097/MLR.0b013e3182408812.

  14. Powell BJ, Proctor EK, Glass JE. A systematic review of strategies for implementing empirically supported mental health interventions. Res Soc Work Pract. 2013;24:192–212. https://doi.org/10.1177/1049731513505778.

  15. Grimshaw J, Thomas R, MacLennan G, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004;8:1–72. https://doi.org/10.3310/hta8060

  16. Landsverk J, Brown CH, Rolls Reutz J, Palinkas L, Horwitz SM. Design elements in implementation research: a structured review of child welfare and child mental health studies. Admin Pol Ment Health. 2011;38(1):54–63. https://doi.org/10.1007/s10488-010-0315-y

  17. Damschroder L, Petersen D. Using implementation research to guide adaptation, implementation and dissemination of patient-centered medical home models. Rockville: Agency for Healthcare Research and Quality; 2013. AHRQ publication no. 13–0027-EF.

  18. Homer CJ, Oyeku SO. Sickle cell disease. Am J Prev Med. 2016;51(1):S3–4. https://doi.org/10.1016/j.amepre.2015.10.018.

  19. Lane PA. Sickle cell disease. Pediatr Clin North Am. 1996;43(3):639–64. https://doi.org/10.1016/S0031-3955(05)70426-0.

  20. Kim S, Brathwaite RF, Kim O. Evidence-based practice standard care for acute pain management in adults with sickle cell disease in an urgent care center. Qual Manag Health care. 2017;26:108–15. https://doi.org/10.1097/QMH.0000000000000135.

  21. Adams-Graves P, Bronte-Jordan L. Recent treatment guidelines for managing adult patients with sickle cell disease: challenges in access to care, social issues, and adherence. Expert Rev Hematol. 2016;9(6):541–52. https://doi.org/10.1080/17474086.2016.1180242.

  22. National Heart, Lung, and Blood Institute RFA-HL-16-010. Sickle cell disease implementation consortium (SCDIC): Using implementation science to optimize care of adolescents and adults with sickle cell disease (U01). 2015.

  23. National Heart, Lung, and Blood Institute. NHLBI awards grants to help improve health outcomes for teens, adults with sickle cell disease - NHLBI, NIH. Updated 2016. https://www.nhlbi.nih.gov/news/2016/nhlbi-awards-grants-help-improve-health-outcomes-teens-adults-sickle-cell-disease. Accessed 20 Jan 2018.

  24. National Institutes of Health. Rigor and reproducibility. 2018. https://grants.nih.gov/reproducibility/index.htm. Accessed 5 Apr 2018.

  25. Welch V, Jull J, Petkovic J, et al. Protocol for the development of a CONSORT-equity guideline to improve reporting of health equity in randomized trials. Implement Sci. 2015;10(1):1. https://doi.org/10.1186/s13012-015-0332-z.

  26. Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687. https://doi.org/10.1136/bmj.g1687.

  27. Pinnock H, Epiphaniou E, Sheikh A, et al. Developing standards for reporting implementation studies of complex interventions (StaRI): a systematic review and e-delphi. Implement Sci. 2015;10(1) https://doi.org/10.1186/s13012-015-0235-z.

  28. Riley BL, MacDonald J, Mansi O, et al. Is reporting on interventions a weak link in understanding how and why they work? A preliminary exploration using community heart health exemplars. Implement Sci. 2008;3(1):27. https://doi.org/10.1186/1748-5908-3-27.

  29. Merzel C, D’Afflitti J. Reconsidering community-based health promotion: promise, performance, and potential. Am J Public Health. 2003;93(4):557–74. https://doi.org/10.2105/AJPH.93.4.557.

  30. Deschesnes M, Martin C, Hill AJ. Comprehensive approaches to school health promotion: how to achieve broader implementation? Health Promot Int. 2003;(4):387–96. https://doi.org/10.1093/heapro/dag410.

  31. Adjemian R, Atbin MZ, Coombs R, Mickan S, Vaillancourt C. Are emergency department clinical pathway interventions adequately described, and are they delivered as intended? A systematic review. Intern Journal of Care Coordination. 2017;20(4):148–61. https://doi.org/10.1177/2053434517732507.

  32. Audrey S, Holliday J, Parry-Langdon N, Campbell R. Meeting the challenges of implementing process evaluation within randomized controlled trials: the example of ASSIST (a stop smoking in schools trial). Health Educ Res. 2006;21 https://doi.org/10.1093/her/cyl029.

  33. Glasziou P, Altman DG, Bossuyt P, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383(9913):267–76. https://doi.org/10.1016/S0140-6736(13)62228-X.

  34. Ivers NM, Grimshaw JM. Reducing research waste with implementation laboratories. Lancet. 2016;388(10044):547. https://doi.org/10.1016/S0140-6736(16)31256-9

  35. Wells M, Williams B, Treweek S, Coyle J, Taylor J. Intervention description is not enough: evidence from an in-depth multiple case study on the untold role and impact of context in randomised controlled trials of seven complex interventions. Trials. 2012;13(1):95. https://doi.org/10.1186/1745-6215-13-95.

  36. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139. https://doi.org/10.1186/1748-5908-8-139.

  37. Michie S, Fixsen D, Grimshaw JM, Eccles MP. Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implementation Sci. 2009;4(1) https://doi.org/10.1186/1748-5908-4-40.

  38. Davies SL, Goodman C. A systematic review of integrated working between care homes and health care services. BMC Health Serv Res. 2011;11 https://doi.org/10.1186/1472-6963-11-320.

  39. Wieringa S, Greenhalgh T. 10 years of mindlines: a systematic review and commentary. Implement Sci. 2015;10(1). https://doi.org/10.1186/s13012-015-0229-x.

  40. Rycroft-Malone J, Burton CR. Is it time for standards for reporting on research about implementation? Worldviews Evid-Based Nurs. 2011;8(4):189–90. https://doi.org/10.1111/j.1741-6787.2011.00232.x.

  41. Clark S, Horton R. Putting research into context--revisited. Lancet. 2010;376(9734):10–1. https://doi.org/10.1016/S0140-6736(10)61001-X.

  42. Baumann A, Cabassa L, Wiltsey Stirman S. Adaptation in dissemination and implementation science. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018. p. 285–300.

  43. Lugtenberg M, Zegers-van Schaick JM, Westert GP, Burgers JS. Why don't physicians adhere to guideline recommendations in practice? An analysis of barriers among Dutch general practitioners. Implementation Sci. 2009;4(1):54. https://doi.org/10.1186/1748-5908-4-54.

  44. Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet. 2009;374(9683):86–9. https://doi.org/10.1016/S0140-6736(09)60329-9.

  45. National Academy of Engineering, Institute of Medicine. Building a better delivery system: A new engineering/health care partnership. Washington, DC: The National Academies Press; 2005. https://www.nap.edu/catalog/11378/building-a-better-delivery-system-a-new-engineeringhealth-care-partnership. https://doi.org/10.17226/11378. Accessed 10 Feb 2018.

  46. Tate DF, Lytle LA, Sherwood NE, et al. Deconstructing interventions: approaches to studying behavior change techniques across obesity interventions. Transl Behav Med. 2016;6(2):236–43. https://doi.org/10.1007/s13142-015-0369-1.

  47. Czaja SJ, Valente TW, Nair SN, Villamar JA, Brown CH. Characterizing implementation strategies using a systems engineering survey and interview tool: a comparison across 10 prevention programs for drug abuse and HIV sexual risk behavior. Implement Sci. 2016;11(1) https://doi.org/10.1186/s13012-016-0433-3.

  48. Belle SH, Stevens J, Cella D, et al. Overview of the obesity intervention taxonomy and pooled analysis working group. Transl Behav Med. 2016;6(2):244–59. https://doi.org/10.1007/s13142-015-0365-5.

  49. Belle SH, Burgio L, Burns R, et al. Enhancing the quality of life of dementia caregivers from different ethnic or racial groups: a randomized, controlled trial. Ann Intern Med. 2006;145(10):727–38. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2585490/.

  50. Schulz R, Czaja SJ, McKay JR, Ory MG, Belle SH. Intervention taxonomy (ITAX): describing essential features of interventions. Am J Health Behav. 2010;34(6):811–21.

  51. James AS, Hudson MA, Campbell MK. Demographic and psychosocial correlates of physical activity among African Americans. Am J Health Behav. 2003;27(4):421–31.

  52. James A, Daley CM, Greiner KA. "Cutting" on cancer: attitudes about cancer spread and surgery among primary care patients in the U.S.A. Soc Sci Med. 2011;73(11):1669–73. https://doi.org/10.1016/j.socscimed.2011.09.017.

  53. Strauss A, Corbin J. Basics of qualitative research: grounded theory procedures and techniques. Newbury Park, CA: Sage Publications; 1998.

  54. Strauss A, Corbin J. Basics of qualitative research: techniques and procedures for developing grounded theory. 2nd ed. Thousand Oaks, CA: Sage Publications; 1998.

  55. Creswell JW. Qualitative inquiry and research design: choosing among five traditions. 4th ed. Thousand Oaks: SAGE Publications; 2009.

  56. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57. https://doi.org/10.1093/intqhc/mzm042.

  57. Czaja SJ, Schulz R, Lee CM, Belle SH. A methodology for describing and decomposing complex psychosocial and behavioral interventions. Psychol Aging. 2003;18:385–95. https://doi.org/10.1037/0882-7974.18.3.385.

  58. Belle SH, Schulz R, Zhang S, et al. Using a new taxonomy to combine the uncombinable: integrating results across diverse interventions. Psychol Aging. 2003;18:396–405. https://doi.org/10.1037/0882-7974.18.3.396.

Acknowledgments

We would like to thank the WUNDIR members for their support, feedback, and review of the study.

Funding

The SCD Implementation Consortium has been supported by US Federal Government cooperative agreements HL133948, HL133964, HL133990, HL133996, HL133994, HL133997, HL134004, HL134007, and HL134042 from the National Heart, Lung, and Blood Institute and the National Institute on Minority Health and Health Disparities (Bethesda, MD).

Author information

Authors and Affiliations

Authors

Consortia

Contributions

AB is the Principal Investigator of the study and drafted the manuscript. SB, AJ and AK provided valuable input on the study design and the draft of the manuscript. All authors have read and accepted this version of the manuscript.

Corresponding author

Correspondence to Ana A. Baumann.

Ethics declarations

Ethics approval and consent to participate

This study protocol has been approved by the Washington University in St. Louis IRB (Protocol # 201709005).

Consent for publication

The Washington University in St. Louis IRB has approved an exemption from written consent for this study.

Competing interests

The authors declare no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Baumann, A.A., Belle, S., James, A. et al. Specifying sickle cell disease interventions: a study protocol of the Sickle Cell Disease Implementation Consortium (SCDIC). BMC Health Serv Res 18, 500 (2018). https://doi.org/10.1186/s12913-018-3297-1

