
The rise of rapid implementation: a worked example of solving an existing problem with a new method by combining concept analysis with a systematic integrative review



The concept of rapid implementation has emerged in the literature recently, but without a precise definition. Further exploration is required to distinguish the concept’s unique meanings and significance from the perspective of implementation science. The study clarifies the concept of rapid implementation and identifies its attributes, antecedents, and consequences. We present a theoretical definition of rapid implementation to clarify its unique meaning and characteristics.


Rodgers' evolutionary concept analysis method, combined with a systematic integrative review, was used to clarify the concept of rapid implementation. A comprehensive search of four databases (EMBASE, MEDLINE, Scopus and Web of Science) was conducted, as well as of relevant journals and the reference lists of retrieved studies. After searching the databases, 2442 papers published from 1963 to 2019 were identified; 24 articles fit the inclusion criteria, capturing data on rapid implementation from across healthcare settings in four countries. Data analysis was carried out using descriptive thematic analysis.


The results trace the introduction of rapid implementation, informed by implementation science. Guidance is provided for further conceptualisation: bridging the gap between research and practice, redefining rigour, adapting the methods used (current approaches, procedures and frameworks), and challenging clinical trial design (the efficacy-effectiveness-implementation pipeline).


It is possible that we are on the cusp of a paradigm shift within implementation brought about by the need for faster results into practice and policy. Researchers can benefit from a deeper understanding of the rapid implementation concept to guide future implementation of rapid actionable results in clinical practice.

Peer Review reports


Implementation may be broadly defined as putting an intervention into effect when delivered in a setting, and is one critical element of evidence-based practice [1]. Implementation science is the rigorous study of implementation, described as the method to promote the uptake of clinical research findings and other evidence-based practice into routine practice and hence improve the quality and effectiveness of healthcare [2]. Unfortunately, these definitions exclude a temporal aspect – that is, how we get what works to the people who need it with the greatest speed and efficiency. Some of the early developers of implementation science recognised this in real-world systems-thinking and methods and began responding to estimates that the time it takes to implement research into clinical practice is 17 years on average, with low uptake of evidence-based findings implemented in practice and poor effect sizes when adopted [3]. This time-gap paradigm has created many challenges for practitioners and policy makers who need rapid, actionable results, such that multiple stakeholders (e.g., practitioners, patients, families, decision-makers, administrators and policy makers) are beginning to question implementation success [4, 5]. This is understandable given the poor outcomes from these necessary but seemingly insufficient approaches. There remains a troubling implementation gap, defined as the difference between our knowledge of what works and the time it takes to get that knowledge into practice in real-world settings.

Rapid implementation is an intriguing possibility for narrowing the implementation gap. We can ask whether rapid implementation can be informed by implementation science, but it has yet to be defined in the literature, and studies are few and far between; nor has a systematic review of rapid implementation studies been undertaken to date within the healthcare and medical spheres. This absence impedes our ability to understand and enable evidence-based findings to find their way quickly into clinical practice [6]. Similarly, the importance of defining concepts has been shown in work on behaviour change interventions, indicating that without standardised behavioural definitions it is difficult to replicate effective interventions and challenging to identify the techniques contributing to effectiveness across interventions [7]. By providing a clear definition of rapid implementation, we avoid concerns previously directed at the science and practice of implementation related to poor consistency of terminology for core concepts, which led researchers to characterise implementation science as a Tower of Babel [8]. A clear definition will ensure that throughout the research and implementation science fields we are all talking about rapid implementation in the same way. This will aid the research community to communicate effectively within and between disciplines, and to apply evidence-based research findings [9]. We sought to use concept analysis to provide a theoretical definition and identify the essential elements of rapid implementation.

Triangulation of methods has been argued to be the future of implementation science, enhancing understanding of data findings and, as a result, shining a light on research challenges from multiple perspectives [10]. No one method reveals absolute truth or provides a definitive standpoint [11]. A new combination of methods, concept analysis and systematic integrative review, is introduced in this paper for the first time to our knowledge; the two have previously been used only separately [12,13,14].

Rapid implementation has the potential to narrow the evidence-practice gap by addressing the delay in implementing research into practice, and is an entirely new concept. Concept analysis is a method for clarifying foundational ideas, derived from a deep analysis of the core elements of a target problem or issue under investigation [15,16,17]. By contrast, an integrative review provides a systematic approach to data examination, considering a range of diverse studies, often traversing both qualitative and quantitative methods, with synthesis and conclusions drawn.

The decision to triangulate concept analysis and integrative review led to the formulation of two aims for the present study: 1) provide an understanding and definition of rapid implementation, informed by knowledge drawn from the implementation science field, and 2) demonstrate the contributions of concept analysis and integrative review, conjoining the strengths of each through this worked example.


The purpose of a concept analysis is to analyse, define, develop and evaluate ambiguous or complex concepts [18] and provide a precise definition. A number of methods have been developed to guide the analysis of a concept [19]. In a recent scoping review of concept analyses by Rodgers [20], the Wilson Method was the most commonly used (Walker & Avant [21]; n = 465), followed by the Evolutionary Method (Rodgers [22]; n = 213) and then the Principle-based Method (Morse et al. [23]; n = 47). Rodgers [20] also highlighted a possible lack of rigour, restricted scope, and failure to approach conceptual work in a systematic way in many of the papers analysed. The Wilson Method [21] has been suggested to enhance critical thinking but has been critiqued for not necessarily producing documentation of a scientific nature [24]. The Principle-based Method [24] has been praised as a robust means of theoretically defining a concept and determining the state of the science at any given point in time [25], although we found its guidelines for doing so rather indistinct. We selected Rodgers' [26] Evolutionary Concept Analysis Method because of the emphasis it gives to examining the quality and the degree of the concept reported in the literature. Rodgers' [26] traditional step-by-step linear approach can be limiting compared to the fluid three-phase evolutionary concept analysis approach previously described by Tofthagen and Fagerstrøm [27] and Delves-Yates and Stockl [18], consisting of: Phase 1, initial phase; Phase 2, core analysis; and Phase 3, further analysis. Combining the three-phase evolutionary method of concept analysis with a systematic integrative review provides an organised process that may enhance rigour, with the systematic integrative review addressing both qualitative and quantitative studies and enabling a more thorough, integrative review of papers covering a range of methodologies [12].
This attempt to produce a more robust and transparent process of assessing the concept of rapid implementation within the literature may lead to more useful and relevant definitions of a concept [20], with the literature in this case being used as the primary source of data [28]. Table 1 highlights how components of the phases and stages of evolutionary concept analysis were augmented by additional stages for conducting a systematic integrative review.

Table 1 Proposing an augmented three-phase framework that combines concept analysis with a systematic integrative review

We also applied an adapted version of Whittemore and Knafl's [29] systematic integrative review method, a previously adapted version of which has been applied to nursing [12] and to intrapersonal, interpersonal and structural barriers in goal-driven behaviours [13]. This adaptation created an additional step of appraising the overall quality of each paper mentioning rapid implementation. This in turn allowed for: 1) description of the search strategy, 2) identification of databases searched, 3) listing inclusion/exclusion criteria, 4) data extraction, 5) methodological quality, and 6) synthesis.

Phase 1 initial phase

Phase 1; stage 1: identify and name concept

The concept ‘rapid implementation’ was selected, having emerged in the literature recently, but without a precise definition.

Phase 1; stage 2: identify and select an appropriate sample for data collection

Whittemore and Knafl’s [29] systematic integrative review method was adopted for this stage. No reviews on implementation science were identified in the Database of Abstracts of Reviews of Effects (DARE) and the Cochrane Database of Systematic Reviews (CDSR). A protocol was developed that included review questions, inclusion and exclusion criteria, search strategy, study selection, data extraction, quality assessment, data synthesis, and plan for dissemination [30].

Phase 1; stage 3: identify surrogate terms and relevant uses of the concept and describe the search strategy

The surrogate terms helped form the key terms used for the search strategy, guided by the PICo framework (P = population, I = phenomenon of interest, Co = context): here, P = the delay in implementing research into practice, I = rapid implementation of research, Co = the hospital/healthcare setting. Final search terms were reviewed by an experienced clinical librarian (JC) for the Centre for Healthcare Resilience and Implementation Science, who consulted on the search strategy and the databases to use. Variations of the following key terms were searched in combination:

  1. (“rapid research” or “rapid implementation”).mp.

  2. implementation science/.

  3.

  4. (dissemination or implementation).mp.

  5. implementation research.ti,ab.

  6. 2 or 3 or 4 or 5.

  7. 1 and 6.
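The line combinations in the strategy above are set operations over the records returned by each numbered search line: "or" unions the result sets and "and" intersects them. A minimal sketch of this logic, using hypothetical record IDs rather than data from the review (line 3's term is not shown in the source, so it is left empty here):

```python
# Hypothetical record IDs per search line (illustrative only; not the
# actual review data).
line1 = {101, 102, 103}   # ("rapid research" or "rapid implementation").mp.
line2 = {102, 104}        # implementation science/.
line3 = set()             # term not shown in the source text
line4 = {103, 104, 105}   # (dissemination or implementation).mp.
line5 = {106}             # implementation research.ti,ab.

line6 = line2 | line3 | line4 | line5   # "2 or 3 or 4 or 5" -> union
line7 = line1 & line6                   # "1 and 6" -> intersection

print(sorted(line7))      # records matching both the rapid and the implementation concepts
```

The final line (7) thus retains only records that mention both the "rapid" terms and at least one implementation-related term.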

Phase 1; stage 4: identify databases searched, inclusion/exclusion criteria, data extraction, methodological quality, and synthesis

The databases and the inclusion/exclusion criteria are described in Table 2. Additional articles meeting the inclusion criteria were obtained through hand searching of relevant journals (see Fig. 1). References were also recognised for inclusion when sources were cited frequently by other authors but had not been identified in the original search results (snowballing).

Table 2 Inclusion and exclusion criteria and databases
Fig. 1

PRISMA flow diagram of the screened studies

Data extraction

Figure 1 provides the study flow diagram, using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), showing the number of articles identified, duplicates removed, and the process of selecting the final studies. JS and SS performed the study selection by reading the titles and abstracts of all studies and sequentially excluding records according to the inclusion/exclusion criteria. If the title and abstract met the inclusion criteria, the full text of the article was read to determine whether the inclusion criteria were met. Data relating to rapid implementation were extracted after each publication was read line by line. To add rigour to this process, the data extraction was then repeated independently by JC, who checked the process to ensure credibility and reduce personal bias.

Methodological quality (risk of bias, quality)

To facilitate inter-rater agreement, a rating format suitable for both qualitative and quantitative studies was used. We adopted the Standard Quality Assessment Criteria for Evaluating Primary Research Papers (QualSyst) set out by Kmet et al. [31], closely following a more recent systematic review [32], and used them in conjunction with the PRISMA reporting guidelines. Inter-rater agreement was assessed using percent agreement (the number of agreement scores divided by the total number of scores) [32].

Two authors (JS, SS) assessed the methodological quality of eligible articles using the Standard Quality Assessment Criteria for Evaluating Primary Research Papers (QualSyst) [31, 32]. Quantitative studies were scored on 14 criteria, while qualitative studies were scored on 10 criteria (see Tables 3 and 4). Mixed-methods papers were scored on both sets of criteria. Each article was given a score of 0 (not met), 1 (partially met), or 2 (met) for each criterion [31, 32]. A summary score was calculated for each study by summing the scores for each criterion and dividing by the total possible score, with higher scores indicating greater methodological quality [32]. Agreement for these studies was 89%. Disagreements were resolved through discussion until consensus was reached.
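The scoring arithmetic described above (a summary score as the sum of criterion scores over the total possible score, and percent agreement as matching scores over total scores) can be sketched as follows. This is an illustrative sketch, not the authors' code, and the example ratings are hypothetical:

```python
def qualsyst_summary(scores):
    """Summary score: sum of criterion scores divided by the total possible
    score (2 per applicable criterion). Scores are 0 (not met),
    1 (partially met) or 2 (met); None marks a non-applicable criterion."""
    rated = [s for s in scores if s is not None]
    return sum(rated) / (2 * len(rated))

def percent_agreement(rater_a, rater_b):
    """Number of agreement scores divided by the total number of scores."""
    agree = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return agree / len(rater_a)

# Hypothetical ratings for one quantitative paper (14 criteria):
paper = [2, 2, 1, 2, 0, 2, 2, 1, 2, 2, None, 2, 2, 2]
print(f"summary score: {qualsyst_summary(paper):.0%}")

# Hypothetical two-rater comparison across the same 14 criteria:
rater_js = [2, 2, 1, 2, 0, 2, 2, 1, 2, 2, 2, 2, 2, 2]
rater_ss = [2, 2, 1, 2, 1, 2, 2, 1, 2, 2, 2, 2, 2, 2]
print(f"agreement: {percent_agreement(rater_js, rater_ss):.0%}")
```

Dropping non-applicable criteria from the denominator (as in the literature-review papers discussed below) is what allows otherwise incomparable study types to be scored on a common 0-100% scale.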

Table 3 Percentage of studies scoring ‘Yes’ for quality assessment criteria (Kmet et al. [31]; Collins et al. [32])
Table 4 Summary of the included studies


Systematic integrative reviews help bring data together in a descriptive thematic synthesis [13, 33]. For data evaluation, studies were reviewed, categorised and critiqued [34]. NVivo v12 Plus [35], a data management tool, was used to facilitate both the synthesis and the critique process.


Data analysis was carried out using descriptive thematic analysis adapted from Thomas et al. [33]. Articles were read and reread, and text was reviewed line by line, to obtain a detailed understanding and familiarisation. Descriptive thematic analysis, with iterative processes, created the themes using the following approach. Significant information from the studies was coded, sub-categorised, and classified into attributes, antecedents, and consequences. Once classified, the codes were reviewed for overarching themes, as outlined in Table 5. JS led the analysis, and consensus was reached during team meetings, where findings were critically examined and questioned by all authors.

Table 5 Themes developed from a synthesis of the literature


Phase 2: results and core analysis

Phase 2; stage 1: study characteristics and risk of bias (quality) assessment

Study characteristics

Our derived sample consisted of 24 studies selected for the period of 2003–2019: 18 studies were from the United States of America (America from here), three were from Australia, one was from Japan, one from Germany and one from the United Kingdom (U.K. from here). Authors such as Glasgow [11, 36,37,38,39,40] from America, and Braithwaite and Rapport [10, 41, 42] from Australia, frequently appeared in the literature. Some areas of clinical practice were prominent in calling for rapid research to align with the speed of progress, for example, precision medicine [43,44,45,46,47]. A diverse range of methodological approaches were taken, including qualitative, quantitative and mixed-methods.

Risk of bias (quality) assessment

Table 3 shows that the quality of the studies was generally high, with quality assessment scores ranging from 40 to 100% (mean QAT score: 100% for quantitative studies and 59% for qualitative studies), confirming the findings of a recent study using the same assessment tool [32]. Although scores for the quantitative studies were high, this was not the case for the qualitative studies. For example, part of the selection criteria involved the inclusion of literature reviews to enrich the information available for assessment; however, not all criteria were applicable to literature reviews, which lowered the quality assessment for these studies (see Table 4). Quality assessment was not a criterion for inclusion: in line with Sandelowski et al.'s [48] study, excluding studies based on quality appraisal could result in valuable data being lost. Instead, the quality appraisal process was used both to increase familiarity with the studies and to highlight their methodological rigour.

Core analysis

Themes derived from the synthesis of the studies have been classified under attributes, antecedents and consequences, as shown in Table 5. Antecedents of rapid implementation included clinical practice areas, e.g., precision medicine. Attributes of rapid implementation included adaptations to methods (current approaches, procedures and frameworks). The consequence is bridging the gap between research and practice by re-thinking trials to produce speedier actionable data that can be of use to practitioners and people working in the field of healthcare delivery.

Phase 2; stage 2: identify the attributes, references, antecedents, and consequences of the concept


Attributes are characteristics of the concept that make it possible to identify situations falling under the concept, and constitute the essential definition of a concept [27, 49]. The defining attributes of the term rapid implementation include rapid [37, 39, 44, 50], responsive [10, 39, 51], relevant [37, 42, 52], efficient [36, 37, 44], and speedy [4, 37, 39] research findings that are produced because of more flexible designs (methods [10, 44, 53], approaches [42], procedures [37, 50], and implementation science frameworks [40, 50, 54]). Other attributes include calls to re-define research rigour [11, 37, 41, 52], which entails promoting research that is both thorough and relevant and that disseminates well into practice, increasing the timeliness and applicability of research innovation, and calls to establish rapid learning research systems [4, 42, 51], which are considered to bring researchers, funders, practitioners, and those working in health systems together to assist in the rapid adoption of research findings in practice.


In light of these defining attributes, antecedents are events preceding the concept [26, 49]. The antecedents of rapid implementation are clinical practice areas, e.g., precision medicine [43,44,45,46,47, 55], observed across the biomedical enterprise in fields such as molecular immunohaematology (molecular oncology) [46], molecular profiling (oncology) [45], molecular tumour boards (precision oncology) [55], and genotyping (biostatistics) [44]. These are rapidly evolving areas that require the rapid deployment of actionable data. It appears that these specific clinical areas are indeed driving the concept of rapid implementation in clinical practice.


Consequences are defined as events or phenomena that result from the concept [26, 49]. The consequence of rapid implementation is a required change to traditional study designs, which are notoriously slow to change and suffer pipeline problems (efficacy, effectiveness and implementation). This includes calls for more appropriate trial designs such as basket trials (discovery-based, which can be phase I or early phase II trials), umbrella trials [43] (which can be phase II, exploratory, or proof-of-concept trials) and qualitative trials [10], all of which attempt to bridge the research-to-practice gap. Consequences, therefore, reflect what we know works and how to get it into practice faster, to respond to the questions of practitioners and decision-makers who make decisions about health care and who need rapid, actionable data to do so.

Phase 2; stage 3: identify concepts related to the concept of interest

Rapid implementation is successful when results are used widely across healthcare settings. As a science area, precision medicine is, for example, changing the way we practice medicine and deliver healthcare by calling for faster, actionable results, and timelines to be shortened, from discovery and application in laboratories, to their recognition as standards of practice [46].

Phase 2; stage 4: identify a model case of the concept

The defining criteria of rapid implementation are presented in an identified model case, intended to illustrate and aid understanding of rapid implementation in use.

Developments in next generation sequencing and information technology have made precision medicine possible, with genetic, omics, clinical, environmental and lifestyle data now available [43]. Scientific and technological advances occur that may make ‘business as usual’ less relevant or even obsolete. Precision medicine is a disruptive innovation that holds the potential to fundamentally alter how evidence-based medicine is practiced and taught (Rushforth A, Greenhalgh T: Personalised medicine, disruptive innovation and ‘trailblazer’ guidelines: Case study and theorization of an unsuccessful change effort, forthcoming). This is at the core of what is driving real time translation at a different speed. Thus, the gulf between research and practice is affecting clinicians who need rapid, actionable data to make decisions. Acquiring research in more rapid ways suggests that practice questions could shape the research methods used, rather than the methods determining the research agenda. Rapid deployment of results means we need to redefine rigour and provide a degree of flexibility.

Proposed theoretical definition

The intent of the proposed theoretical definition is to highlight how the analysis revealed rapid implementation as a key concept. The findings indicated no clear theoretical definition at present. Drawing on our analysis we propose the following theoretical definition for rapid implementation:

Rapid implementation provides the best possible evidence-based practice of a program or intervention to those who need it, with speed and efficiency, by redefining rigour, and adapting both methods (adapting current approaches, procedures and implementation frameworks), and trial design, to fit research aims and objectives.


Phase 3; stage 1: further development of the concept

The work in this paper provides a method to increase our understanding of rapid implementation in terms of doing all types of implementation science more efficiently, with rapid implementation as an intriguing possibility to bridge the gap between research and practice and get actionable results into practice more quickly and effectively. We attempted to uncover the core concepts in the literature and synthesise the findings from papers defining themselves as involved in some respect in rapid implementation, within the broad remit of implementation science. The concept of rapid implementation has until now been without a precise definition. This study provides one, derived from establishing the meaning, attributes, and characteristics of rapid implementation.

The theoretical definition derived from our results characterises rapid implementation as incorporating speed and efficiency, while having the ability to adapt methods and trial design to suit the needs of complex studies. The literature in this area is still in its infancy and remains largely descriptive in terms of how study design and strategies can reduce the time it takes to move evidence into practice. Our study has brought this to light, by focusing on defining rapid implementation as an emerging area of importance, and by so doing, providing a fundamental definition (‘building blocks’) of rapid implementation that is for the first time being made explicit. This not only ensures the international community can communicate more effectively within and between disciplines [9], but that research results have the potential to be more valid and reliable. Having a standard definition of rapid implementation may make it possible for research to replicate effective interventions [7] and shape future research to improve the evidence-base.

Links across themes

The fundamental basis of rapid implementation is clinical practice – with its need for fast information on which to base good clinical decisions. It is exemplified here by precision medicine, which is amongst the areas leading the field in the application of rapid implementation approaches and ideas [43,44,45,46, 55]. Attributes of rapid implementation (adapting methods, procedures, and frameworks) are challenging traditional implementation, and the consequence is that rapid implementation can help bridge time-gaps between research and practice (working on, for example, research practice and clinical policy simultaneously, or re-thinking and shortening the length of trials). Antecedents that relate to clinical practice areas, and attributes of rapid implementation that challenge traditions, are of particular significance, appreciating the bidirectional relationship between practice and research. The clinical practice studies retrieved discussed science areas, such as genomics, that have evolved quickly within the precision medicine paradigm [43,44,45,46,47, 55] and suggested that these areas require research findings to be produced quickly, to provide recommendations, so that a patient can be treated in a timely way. These efforts now include calls for more appropriate trial designs such as basket trials or umbrella trials [43] and qualitative trials [10]. Ways to address this problem can be found within the core attributes of rapid implementation: to feed information and findings back more quickly to clinical practice. The research literature also calls for a redefinition of rigour in undertaking a rapid implementation study as part of the implementation science agenda, along with the view that rapid learning research systems need to be deployed to ensure that research can meet the time-pressured demands of clinical practice [56].
In this respect, there is broad agreement amongst researchers and practitioners that there needs to be a common cause to support the rapid implementation of research findings into clinical practice. Harnessing rapid learning research systems and precision medicine models of care together may foster greater stakeholder collaborations, encouraging greater integration between researchers, funders, health systems workers, practitioners, and community partners, focussing on time-pressured, clinically relevant questions [39].

Antecedents of clinical practice areas

Clinical practice areas within the broad remit of precision medicine [43], such as molecular oncology [46], molecular profiling [45], molecular tumour boards [55], and genotyping [44] all require rapid implementation, creating urgency for implementation science to research the most effective ways to inform how we create those changes. Churruca et al. [42] discussed genomics and the role of evidence within implementation science, highlighting why evidence slowly and only inconsistently makes its way into practice [3]. In support, Peek et al. [52] argued that it was unacceptable that only 14% of funded research made its way through the minefields of uptake into practice [3]. Putting this into perspective, precision medicine’s emerging technologies have evolved so much in the time it takes to implement change in real-time practice that the original protocols developed ahead of the subsequent research findings can be redundant.

Attributes at the core of rapid implementation

Research challenging traditional implementation is essentially about research being more responsive. We need to transition from traditional implementation towards more sustainable, rapid implementation. Rapid research must provide actionable results and scientific rigour, as discussed by Rapport and Braithwaite [10] and Peek et al. [52]. To uphold rigour, we need to redefine it to reflect the needs of a range of stakeholders (for example, practitioners, decision-makers, and policy makers) and a more pragmatic approach to research. Peek et al. [52] suggest that current conceptions of rigour do not allow for this and limit the range of real-world situations in which intervention implementation is feasible. Striking a balance between rigour, rapidity and flexibility of methods and procedures [55] is, however, difficult to achieve [37, 50].

In redefining rigour, we must be mindful that research aims and objectives should determine the research methods rather than the methods driving the research agenda [57]. If contexts and needs require rapid implementation, then current methods must be adapted [58]. To help understand the mechanisms and contexts of implementation, researchers are exploring generating speedy actionable findings through mobile methods [10, 59], case studies [42, 53], and the transition from traditional ethnographic methods to rapid ethnography [60] to inform rapid improvements to healthcare. These initiatives are part of the overarching shift towards rapid implementation science by researching the most efficient ways to implement evidence [52].

We must also reassess how we manage data. Some studies have introduced ‘rapid analysis’ [50], defined as adapting procedures to produce speedy, efficient and valid findings, as well as providing timely information of value to stakeholders (practitioners, patients, families, decision-makers, administrators and policy makers) [37]. This may also mean adjusting recruitment processes and survey procedures to enhance participation rates [50].

A structural mechanism for progress is becoming known as ‘rapid learning research systems’ [39]. Churruca et al. [42] presented case studies that explored a rapid learning research system in the field of genomics and suggested a new approach recommending that implementation scientists be embedded within the very fabric of the healthcare system with the implementation scientist being viewed as one of the team. Guthrie et al. [53] presented case studies showing a number of different actors (practitioners, surgeons, policy makers) playing a role in bridging the gap between research and practice. For Churruca et al. [42] this can build social capital by sharing knowledge with, for example, local clinical and laboratory genomics researchers. In rapid learning research systems, stakeholder (researchers, practitioners and surgeons) roles are more equalised, and partnerships are emphasised [37]. The transformation to a rapid learning research system will require a concerted effort by research funders, academic institutions, healthcare systems, researchers, and a variety of practice, community, and policy stakeholders to evoke the culture shift in how people work and how research is co-created collaboratively.

There is help at hand, however. Multiple implementation science frameworks have increased the potential for rapid uptake, such as: the Consolidated Framework for Implementation Research (CFIR) [50, 54], My Own Health Report (MOHR) [37], the Practical, Robust, Implementation and Sustainability Model (PRISM) [40] and Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) [4, 40]. CFIR was used in the majority of the studies we reviewed [50, 54] and was an indirect focus of one other study [4]. CFIR focuses on components of system change and is intended to be flexible, so that researchers can tailor the framework to the specific intervention design, factors, and context being studied; it is useful in guiding rapid-cycle evaluation of the implementation of practice transformation initiatives [50]. Gale et al. [50] carried out a rapid process evaluation guided by CFIR, completed within 12 months. Beck et al. [40] applied PRISM, combining it with RE-AIM components. The combination created a tailored implementation plan for twenty-first century well-child care, facilitating the implementation process and improving success in spreading and sustaining care models in paediatric practices. Battaglia and Glasgow [4] discussed RE-AIM as a framework for validating measures of reach, effectiveness, adoption, implementation, and maintenance, raising the importance of sustainability as a key implementation outcome. Other examples applying rapid logic include MOHR [37], a practice-level, cluster-randomised pragmatic implementation study designed to develop fast, actionable evidence around the use of patient-reported measures in patient care.

Narrowing the gap - consequences leading to bridging the gap between research and practice

From the foregoing, it is clearly important to bridge the gap between research and practice to ensure implementable interventions are current, relevant and applicable to real-time practice, encouraging uptake and ensuring they become established [4, 42, 50]. Randomised controlled trials (RCTs) are ubiquitous, but uncertainties exist in how rapid implementation could be addressed by the classic RCT [37, 52, 57]. Present evidence indicates that randomised efficacy trials take approximately five and a half years from the initiation of enrolment to publication, and seven years or more once the time from grant application submission to enrolment initiation is added [53, 58, 61]. In the real-world environment of clinical practice, this time lag is unacceptable, as well as impractical in a study on, say, a rare disease or the pandemic outbreak of COVID-19 [53, 58, 62]. An ethos of rapid implementation can help challenge the current static notion of good science (following a laborious, pipeline efficacy-effectiveness-implementation logic) in which too much good science falls behind. The pipeline model suggests efficacy studies precede effectiveness or implementation research, yet the factors associated with success in efficacy trials are often not relevant to, and sometimes inversely related to, those associated with success in later stages [11]. As a result, we often see a ‘voltage drop’ [4] (reduced fidelity of the intervention when disseminated to other settings), a lack of guidance in tailoring interventions to the local context, and, all too often, inadequate resources being made available for implementation. Clinical trials need to be more pragmatic: open to a range of methods, as necessary, to address a research question [11, 63], and able to address questions that are relevant to the multiple stakeholders involved.
Real-world alternatives, such as qualitative trials [10] and basket or umbrella trials [43], can potentially replace the classic ‘placebo, no treatment, or control’ design. Bando [43] indicates, when considering drug testing, that it is important to have a trial design that can efficiently distribute targeted drugs, and suggests umbrella and basket designs. Other authors, such as Glasgow and Chambers [64], propose blending the efficacy and effectiveness stages of intervention development to improve the speed of knowledge creation and increase the usefulness and policy relevance of clinical research. Blending effectiveness and implementation research together has been referred to as the hybrid effectiveness-implementation typology [4, 11, 42]. The idea suggests rapid implementation research designs will hasten the movement of interventions from effectiveness testing through to implementation [62]. As proposed by Raine et al. [57], rapid implementation is moving past the classic large-scale multicentre RCT and towards implementing a broad menu of rapid methods. This evidence adds further support to our findings [10, 42, 43, 59, 60]. Our study should not be viewed as an endpoint, but as increasing understanding of rapid implementation and providing clarity for the next step in our field: placing greater focus on applying and/or adapting rapid methods in implementation science, and considering what future challenges and opportunities this may present.

Phase 3; stage 2: strengths and limitations

To our knowledge, this method combination (concept analysis and systematic integrative review) is introduced for the first time in this study, to assure analytical depth, rigour and replicability. However, because non-English-language studies were excluded, some insights may have been missed. Applying the augmented three-phase framework that combined concept analysis with a systematic integrative review resulted in a robust process that enhanced the quality and transparency of the data produced. A clear conceptual definition of rapid implementation is now available and supports international efforts to implement actionable results in clinical practice more rapidly. Precision medicine is still in its infancy, but it holds the potential to fundamentally alter how evidence-based medicine is practised and taught. We see the meeting of rapid implementation and precision medicine as potentially providing the demonstrations needed at the intersection of precision medicine and rapid learning research systems. Finally, we note the lack of research involving the patient’s voice, and the need to involve patients as additional stakeholders in implementation science research generally, and within rapid implementation particularly. This points to the need for future research in this area [65].


Conclusion

While rapid implementation is, in some ways, evolutionary, in other ways it is revolutionary. New methods are potentially dislodging current ones; longstanding theories and ways of doing research are being adapted and reconfigured, with many stakeholders (e.g., practitioners, patients, families, decision-makers, administrators and policy makers) recognising the need for faster answers to get results into practice more speedily, thereby countering criticisms of standard implementation delays and the research-practice gap. If it can be made to work, the future of rapid implementation informed by implementation science is bright. It can help efficiently integrate science into practice using contextual and systems perspectives, focusing on adaptation, pragmatic trials and mixed methods, and engendering a degree of flexibility in data assessment and interpretation. The key remaining question is: how far and how fast can we go?

Availability of data and materials

Not applicable. All relevant data are within the article.



Abbreviations

USA: United States of America
CDSR: Cochrane Database of Systematic Reviews
CFIR: Consolidated Framework for Implementation Research
DARE: Database of Abstracts of Reviews of Effects
MOHR: My Own Health Report
PICo: P = population, I = phenomena of interest, Co = context
PRISM: Practical, Robust, Implementation and Sustainability Model
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
QAT: Quality Assessment Tool
QualSyst: Standard Quality Assessment Criteria for Evaluating Primary Research Papers
RE-AIM: Reach, Effectiveness, Adoption, Implementation, and Maintenance
RCT: Randomised Controlled Trial
UK: United Kingdom


References

1. Breitenstein SM, Gross D, Garvey CA, Hill C, Fogg L, Resnick B. Implementation fidelity in community-based interventions. Res Nurs Health. 2010;33(2):164–73.

2. Eccles MP, Mittman BS. Welcome to Implementation Science. Implement Sci. 2006;1(1):1.

3. Balas EA, Boren SA. Managing clinical knowledge for health care improvement. Yearb Med Inform. 2000;9(1).

4. Battaglia C, Glasgow RE. Pragmatic dissemination and implementation research models, methods and measures and their relevance for nursing research. Nurs Outlook. 2018;66(5):430–45.

5. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med. 2011;104(12):510–20.

6. Mikkelsen G, Frederiksen K. Family-centred care of children in hospital – a concept analysis. J Adv Nurs. 2011;67(5):1152–62.

7. Abraham C, Michie S. A taxonomy of behavior change techniques used in interventions. Health Psychol. 2008;27(3):379.

8. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, et al. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a tower of babel? Implement Sci. 2010;5:16.

9. Bell AF, Lucas R, White-Traut RC. Concept clarification of neonatal neurobehavioural organization. J Adv Nurs. 2008;61(5):570–81.

10. Rapport F, Braithwaite J. Are we on the cusp of a fourth research paradigm? Predicting the future for a new approach to methods-use in medical and health services research. BMC Med Res Methodol. 2018;18(1):131.

11. Glasgow RE, Chambers D. Developing robust, sustainable, implementation systems using rigorous, rapid and relevant science. Clin Transl Sci. 2012;5(1):48–55.

12. Jackson D, Hickman LD, Hutchinson M, Andrew S, Smith J, Potgieter I, et al. Whistleblowing: an integrative literature review of data-based studies involving nurses. Contemp Nurse. 2014;48(2):240–52.

13. Smith J, Smith S. The constraint-effects-mitigation involvement loyalty model: an integrative review. Leis Sci. 2017;39(3):244–60.

14. Brush BL, Kirk K, Gultekin L, Baiardi JM. Overcoming: a concept analysis. Nurs Forum. 2011;46(3):160–8.

15. Helvig A, Wade S, Hunter-Eades L. Rest and the associated benefits in restorative sleep: a concept analysis. J Adv Nurs. 2016;72(1):62–72.

16. Liu Y, Avant KC, Aungsuroch Y, Zhang X-Y, Jiang P. Patient outcomes in the field of nursing: a concept analysis. Int J Nurs Sci. 2014;1(1):69–74.

17. Goodnite PM. Stress: a concept analysis. Nurs Forum. 2014;49(1):71–4.

18. Delves-Yates C, Stockl A, Moore J. Making sense of concept analysis. Nurse Res. 2018;25(4):43–6.

19. Nevin M, Smith V, Hynes G. Non-specialist palliative care: a principle-based concept analysis. Palliat Med. 2019;33(6):634–49.

20. Rodgers BL, Jacelon CS, Knafl KA. Concept analysis and the advance of nursing knowledge: state of the science. J Nurs Scholarsh. 2018;50(4):451–9.

21. Walker LO, Avant KC. Strategies for theory construction in nursing. 2005.

22. Rodgers BL. Concept analysis: an evolutionary view. In: Rodgers BL, Knafl KA, editors. 2nd ed. Philadelphia, PA: Saunders; 2000.

23. Morse JM, Mitcham C, Hupcey JE, Tasón MC. Criteria for concept evaluation. J Adv Nurs. 1996;24(2):385–90.

24. Hupcey JE, Penrod J. Concept analysis: examining the state of the science. Res Theory Nurs Pract. 2005;19(2):197.

25. O'Malley D, Higgins A, Smith V. Postpartum sexual health: a principle-based concept analysis. J Adv Nurs. 2015;71(10):2247–57.

26. Rodgers BL. Concepts, analysis and the development of nursing knowledge: the evolutionary cycle. J Adv Nurs. 1989;14(4):330–5.

27. Tofthagen R, Fagerstrøm LM. Rodgers’ evolutionary concept analysis – a valid method for developing knowledge in nursing science. Scand J Caring Sci. 2010;24:21–31.

28. Morse JM. Exploring the theoretical basis of nursing using advanced techniques of concept analysis. Adv Nurs Sci. 1995.

29. Whittemore R, Knafl K. The integrative review: updated methodology. J Adv Nurs. 2005;52(5):546–53.

30. Centre for Reviews and Dissemination. Systematic reviews: CRD's guidance for undertaking reviews in health care. York: University of York, Centre for Reviews and Dissemination; 2009.

31. Kmet LM, Lee RC, Cook LS. Standard quality assessment criteria for evaluating primary research papers from a variety of fields. Edmonton: Alberta Heritage Foundation for Medical Research; 2004.

32. Collins DE, Ellis SJ, Janin MM, Wakefield CE, Bussey K, Cohn RJ, et al. A systematic review summarizing the state of evidence on bullying in childhood cancer patients/survivors. J Pediatr Oncol Nurs. 2019;36(1):55–68.

33. Thomas J, Harden A, Oakley A, Oliver S, Sutcliffe K, Rees R, et al. Integrating qualitative research with trials in systematic reviews. BMJ. 2004;328(7446):1010–2.

34. Cooper H. The integrative research review: a systematic approach. Beverly Hills: Sage; 1984.

35. QSR International Pty Ltd. NVivo qualitative data analysis software version 12 plus. Victoria: QSR International Pty Ltd; 2016.

36. Glasgow R, Lichtenstein E, Marcus A. Why don't we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93(8):1261–7.

37. Glasgow RE, Kessler RS, Ory MG, Roby D, Gorin SS, Krist A. Conducting rapid, relevant research: lessons learned from the My Own Health Report project. Am J Prev Med. 2014;47(2):212–9.

38. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012;102(7):1274.

39. Riley WT, Glasgow RE, Etheredge L, Abernethy AP. Rapid, responsive, relevant (R3) research: a call for a rapid learning health research enterprise. Clin Transl Med. 2013;2(1):10.

40. Beck A, Bergman DA, Rahm AK, Dearing JW, Glasgow RE. Using implementation and dissemination concepts to spread 21st-century well-child care at a health maintenance organization. Perm J. 2009;13(3):10–8.

41. Rapport F, Clay-Williams R, Churruca K, Shih P, Hogden A, Braithwaite J. The struggle of translating science into action: foundational concepts of implementation science. J Eval Clin Pract. 2018;24(1):117–26.

42. Churruca K, Ludlow K, Taylor N, Long JC, Best S, Braithwaite J. The time has come: embedded implementation research for health care improvement. J Eval Clin Pract. 2019;25(3):373–80.

43. Bando H. The current status and problems confronted in delivering precision medicine in Japan and Europe. Curr Probl Cancer. 2017;41(3):166–75.

44. Basu S, Zhang YW, Ray D, Miller MB, Iacono WG, McGue M. A rapid gene-based genome-wide association test with multivariate traits. Hum Hered. 2013;76(2):53–63.

45. Birendra KC, Afzal MZ, Sochaki A, Wentland KA, Chang R, Singh S, et al. Tumor molecular profiling in the treatment of refractory cancers. J Exp Ther Oncol. 2015;11(1):27–32.

46. Denomme GA, Flegel WA. Applying molecular immunohematology discoveries to standards of practice in blood banks: now is the time. Transfusion. 2008;48(11):2461–75.

47. Francescatto L, Katsanis N. Newborn screening and the era of medical genomics. Semin Perinatol. 2015;39(8):617–22.

48. Sandelowski M, Docherty S, Emden C. Qualitative metasynthesis: issues and techniques. Res Nurs Health. 1997;20(4):365–71.

49. Foley AS, Davis AH. A guide to concept analysis. Clin Nurse Spec. 2017;31(2):70–3.

50. Gale RC, Wu J, Erhardt T, Bounthavong M, Reardon CM, Damschroder LJ, et al. Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implement Sci. 2019;14(1):11.

51. Kilbourne AM, Rani Elwy A, Sales AE, Atkins D. Accelerating research impact in a learning health care system: VA's Quality Enhancement Research Initiative in the Choice Act era. Med Care. 2017;55(7):S4–S12.

52. Peek CJ, Glasgow RE, Stange KC, Klesges LM, Peyton Purcell E, Kessler RS. The 5 R's: an emerging bold standard for conducting relevant research in a changing world. Ann Fam Med. 2014;12(5):447–55.

53. Guthrie S, Grant J, Hanney S, Pollitt A. Investigating time lags and attribution in the translation of cancer research: a case study approach. Cambridge: RAND Europe; 2014.

54. Keith RE, Crosson JC, O'Malley AS, Cromp D, Taylor EF. Using the Consolidated Framework for Implementation Research (CFIR) to produce actionable findings: a rapid-cycle evaluation approach to improving implementation. Implement Sci. 2017;12(1):15.

55. Burkard ME, Deming DA, Parsons BM, Kenny PA, Schuh MR, Leal T, et al. Implementation and clinical utility of an integrated academic-community regional molecular tumor board. JCO Precis Oncol. 2017;2017(1):1–10.

56. Chambers DA. Commentary: increasing the connectivity between implementation science and public health: advancing methodology, evidence integration, and sustainability. Annu Rev Public Health. 2018;39(1):1–4.

57. Raine R, Fitzpatrick R, Barratt H, Bevan G, Black N, Boaden R, et al. Challenges, solutions and future directions in the evaluation of service innovations in health care and public health. Health Serv Deliv Res. 2016;4(16):215–16.

58. Wilder-Smith A, Chiew CJ, Lee VJ. Can we contain the COVID-19 outbreak with the same measures as for SARS? Lancet Infect Dis. 2020.

59. Rapport F, Auton E, Cartmill J, Braithwaite J, Shih P, Hogden A, et al. Fit for purpose? OrganisationaL prOdUctivity and woRkforce wellbeIng in workSpaces in hospital (FLOURISH): a multimethod qualitative study protocol. BMJ Open. 2019;9(4):e027636.

60. Reeves S, Peller J, Goldman J, Kitto S. Ethnography in qualitative educational research: AMEE guide no. 80. Med Teach. 2013;35(8):e1365–e79.

61. Randhawa GS, Soltysiak MP, El Roz H, de Souza CP, Hill KA, Kari L. Machine learning using intrinsic genomic signatures for rapid classification of novel pathogens: COVID-19 case study. PLoS One. 2020;15(4):e0232391.

62. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26.

63. Rapport F, Storey M, Porter A, Snooks H, Jones K, Peconi J, et al. Qualitative research within trials: developing a standard operating procedure for a clinical trials unit. Trials. 2013;14(1):54.

64. Glasgow RE, Chambers D. Developing robust, sustainable, implementation systems using rigorous, rapid and relevant science. Clin Transl Sci. 2012;5(1):48–55.

65. Bergerød IJ, Braut GS, Wiig S. Resilience from a stakeholder perspective: the role of next of kin in cancer care. J Patient Saf. 2018.



Acknowledgements

Not applicable.


Funding

The combined concept analysis with systematic integrative review study was funded by NSW Health. This manuscript presents independent research commissioned by NSW Health. Paediatrio Limited is a not-for-profit cooperative joint venture between the Sydney Children’s Hospitals Network, the Children’s Medical Research Institute, and the Children’s Cancer Institute, established with the support of the NSW Government to coordinate and integrate paediatric research. Paediatrio is also affiliated with The University of Sydney and UNSW Sydney. The views and opinions expressed by the authors in this publication are those of the authors and do not necessarily reflect those of NSW Health. The funder had no involvement in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the article for publication.

Author information




Contributions

JS is the guarantor for this study. JS, JB, FR and TO conceived the study. JS and SS designed the study, which was conducted by JS and SS with assistance from JC. JS, SS, JB, FR, TO, VT, EM, JL and HG helped interpret the results. JS drafted the manuscript with editing advice and support from JB and FR. All authors edited the manuscript for important intellectual content and approved the final manuscript.

Authors’ information

Not applicable.

Corresponding author

Correspondence to James Smith.

Ethics declarations

Ethics approval and consent to participate

Not applicable. The combined concept analysis with a systematic integrative review did not require ethical approval.

Consent for publication

Not applicable.

Competing interests

All authors have completed the ICMJE uniform disclosure form. JB reports grants from NSW Health during the conduct of the study. The other authors declare that there are no relevant conflicts of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Smith, J., Rapport, F., O’Brien, T.A. et al. The rise of rapid implementation: a worked example of solving an existing problem with a new method by combining concept analysis with a systematic integrative review. BMC Health Serv Res 20, 449 (2020).



Keywords

  • Concept analysis
  • Implementation
  • Implementation science
  • Healthcare
  • Rapid implementation
  • Systematic integrative review