
Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit

Abstract

Background

Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual.

Methods

Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians, and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'.

Results

On-line data collection was effective: over a four-week period, 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements, but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result, we were able to translate the core constructs into a simplified set of statements that can be used by non-experts.

Conclusion

Normalization Process Theory has been developed through transparent procedures at each stage of its life. The theory has been shown to be sufficiently robust to merit formal testing. This project has provided a user friendly version of NPT that can be embedded in a web-enabled toolkit and used as a heuristic device to think through implementation and integration problems.

Background

Recent years have seen steadily more sophisticated approaches to the evaluation of complex interventions and technological innovations in health care. In particular, evaluation frameworks like that proposed by the UK Medical Research Council have emphasized the need to understand the complex components and contingent underpinnings of outcomes studies, especially clinical trials [1, 2]. At the same time, there have been calls for theory-driven approaches to such work [3, 4]. Theories are valuable in such work not because they provide clear and unambiguous solutions to outcomes problems, but because they can provide robust, generic, and transferable explanations of the processes that shape these outcomes. They perform the further useful function of making transparent the assumptions of researchers and others that underpin research questions, methodology, and explanations [5, 6].

Normalization Process Theory (NPT) [7] and its predecessor, the Normalization Process Model [8, 9], provide a conceptual framework to assist in understanding and explaining the dynamic processes encountered during the implementation of complex interventions and technological or organizational innovations in healthcare.

Robust social science theories already explain some important features of implementation and integration processes: individual differences in attitudes and intentions in relation to new technologies and practices (e.g. Theory of Planned Behavior [10]), the flow of innovations through social networks (e.g. Diffusion of Innovations Theory [11]), and reciprocal interactions between people and artifacts (e.g. Actor Network Theory [12]). NPT differs from these theories because it offers an explanatory model of the routine embedding of a classification, artefact, technique or organizational practice in everyday work. NPT focuses on the agentic contribution--the things that people do--of individuals and groups. It thus explains phenomena not well covered by existing theories.

NPT was initially developed as an applied theoretical model to assist clinicians and researchers in understanding and evaluating the factors that promote and inhibit the routine incorporation of complex healthcare interventions in practice. It started from a set of empirical generalizations derived from secondary analyses of qualitative data collected in a wide variety of studies of complex interventions in healthcare. This resulted in the original constructs of the model [8]. Further empirical applications of the model showed that while it could explain factors that promote and inhibit collective action, it left unexplained how participants came to engage with and support the practice, and how they reflected on and evaluated it. Through the development of further constructs, accounting for how people make sense of a practice, participate in it, and appraise what they do, the model became a theory. Over the past four years it has been developed as a middle-range theory of socio-technical change [7], which characterizes the mechanisms involved in the embedding of social practices within their immediate and broader social contexts.

The starting point of NPT is that to understand the embedding of a practice we must look at what people actually do and how they work [7]. NPT focuses on four theoretical constructs, which characterize mechanisms that are energized by investments made by participants.

  1. Processes of individual and communal sense-making that promote or inhibit the coherence of a complex intervention to its users. These processes are driven by investments of meaning made by participants.

  2. Processes of cognitive participation that promote or inhibit users' enrolment and legitimation of a complex intervention. These processes are driven by investments of commitment made by participants.

  3. Processes of collective action that promote or inhibit the enacting of a complex intervention by its users. These processes are driven by investments of effort made by participants.

  4. Processes of individual and communal reflexive monitoring that promote or inhibit users' comprehension of the effects of a complex intervention. These processes are driven by investments in appraisal made by participants.

These mechanisms, and their underpinning investments, are constrained (and released) by the operation of norms (notions of how beliefs, behaviours, and actions should be accomplished); and conventions (how beliefs, behaviours, and actions are practically accomplished). In this context, mechanisms, investments, and constraints form processes of organized, dynamic, and contingent interaction between: agents (the individuals or groups that interact in encounters around a practice); objects (the classifications, artifacts, practices and procedures employed by agents); and contexts (the technical and organizational structures in which agents and objects are implicated) [8, 9]. The primary focus of NPT is therefore the analysis of social action. As far as possible, its central constructs and their dimensions refer to observable social mechanisms [13–15] that shape the practical workability and integration of some complex intervention or technology. For health services researchers interested in process evaluation NPT provides a verifiable and empirically grounded model of the operation of factors that promote and inhibit the routine incorporation of interventions in everyday practice. For social scientists, NPT provides a well characterized middle-range theory of socio-technical change.

Although it is a relatively new theory, it has been used to:

  • inform the development and evaluation of complex clinical and organizational interventions for mental health care [16, 17]

  • examine the work processes entailed in implementing treatment regimes into patients' routines [18]

  • inform evaluations of treatment modalities in cancer [19], and diabetes [20].

  • aid understanding of the findings of randomised controlled trials for psychosocial distress and nurse-led clinics for heart failure treatment [21], chronic constipation [22] and collaborative care for depression [23]

  • inform the redesign of primary care mental health services [24] and self-management training packages [25].

  • support the development and application of decision-support tools [26] and inform a systematic review of evidence about their utilization [27]

  • aid understanding of the implementation of telecare and e-health systems in a wide variety of contexts [28–38]

Theories of all kinds are formed through complex interpretive processes that lead to inherently abstract products. Abstraction is, in fact, a necessary condition of a theory, since it must be sufficiently context-independent to be applicable to the range of relevant cases that it might be required to explain [39]. The problem that users of a theory face, then, is translating the theory from its abstract context-independent form into a form that can be used to solve problems in everyday settings. NPT is no exception. Our aim in the work reported here, therefore, has been to translate NPT's constructs into a set of statements that can be used by managers, clinicians, and researchers to work through problems of design and implementation in relation to complex interventions and new health technologies. These simplified constructs were translated into a set of statements that form the basis of a toolkit http://www.normalizationprocess.org for clinicians, managers and policy-makers interested in utilizing NPT in their work.

The purpose of this simplification work was to develop a set of generic statements that could be configured as the 'front end' of a web enabled toolkit for users of NPT. For this reason, we sought engagement and critique from NPT's user communities (Health Services Researchers, Clinical Researchers, and Social Scientists). The co-production of theories is normal in large scale investigations in the natural sciences but is much less common in the social and behavioural sciences. In such circumstances, peers are usually asked to test theories rather than collaborate in defining the means by which they are operationalized. We have sought to be as transparent as possible in the generation of the theory, and as inclusive as possible in its operationalization and stabilization in practice. Our view is that this continuous 'road testing' of basic constructs and components of the theory has done more than ensure construct validity. It has ensured that the theory is relevant to its users. In this paper we present a simplified set of 16 statements that express key elements of NPT but which can be applied without a detailed knowledge of the underlying theory. However, we must also offer a caveat. Our objective in this work was to simplify a set of theoretical constructs for heuristic purposes, and not to develop a set of validated questions that could be immediately embedded in quantitative research instruments or qualitative interview schedules. The purpose of this paper is to make transparent the process by which the 16 statements and explanations were generated, and thus be clear about the foundation of the claims we make about them.

Methods

Understanding how NPT was applied by users to real-world problems

Prior to the idea for the toolkit emerging, we sought to better understand the ways that potential users of NPT could apply it to real world problems. Between 2006 and 2009 we engaged with multiple potential users.

Engaging potential users included presentations to researchers and practitioners that linked NPT's core constructs to practical research and development problems. It also included open workshops and master-classes for researchers and practitioners interested in NPT in the UK, Australia, Canada, and the US, in addition to individual correspondence and discussion with both experienced senior and neophyte researchers interested in employing NPT in their work. These encounters provided us with an opportunity to explore the views of NPT's potential users and their critiques of both its core assumptions and constructs and of the ways that these were presented. Some potential users were sceptical, arguing that NPT offered no advantage over the Theory of Planned Behavior [10] because its predictive value was unknown, and others that it was incompatible with Actor-Network Theory [12] because of its insistence on explanation over description.

At the same time, we engaged closely with critical actual users of NPT. This included work to stabilize the constructs of the theory, described elsewhere [40], to apply them in practice to statement development for surveys, systematic reviews and qualitative investigations [41, 42], and to define appropriate ways to apply the theory. We did this through meetings of a Peer Learning Set funded by the UK National Institute for Health Research, and through personal communications with researchers using NPT in existing studies [17, 23, 25, 26, 34]. We used this group of actual users to help identify the sources of ambiguity and complexity in users' experiences of the theory. It was through our engagement with these actual users that the idea of simplifying the abstract constructs and developing a toolkit first emerged.

Translating abstract constructs into simple statements

Our second task - and the topic of this paper - was to translate the abstract constructs of the theory into their simplest possible statements, drawing, in part, on the experience we gained during the process of presenting NPT to potential and actual users. This is a process analogous, but not identical, to statement development in questionnaire design, and it rests on rigorous construct validation. We divided it into three sequential tasks.

  • We distilled each construct to a single statement of no more than two sentences. These identified the underlying social mechanism (Coherence, Collective Action, etc), explained what factors this mechanism shaped (sense-making, enacting, etc), and specified the social investments that energized it (meaning, effort, etc). This led to four construct explanations.

  • We met as a group and spent two days reducing each of the components of the four constructs to a single sentence that described what people do when they act in relation to them. This led to 16 component explanations.

  • We then constructed a set of 16 statements that expressed each component as a single context-independent statement that could be addressed to participants in an implementation-integration process. This led to 16 component statements.

These statements and explanations were 'road tested' in seminars at the Mayo Clinic (Rochester, Minnesota, US) and Dundee University (Scotland, UK) in April and May 2010. On 1 June 2010 we sent the statements and explanations (see Additional File 1, first column, 'Original Statement and Construct Explanation') to participants selected by criterion sampling. One of us (CRM) had kept an archive of NPT-related emails and other correspondence since 2004, and this formed the sampling frame from which participants were selected. The sampling criterion was that participants appeared to be sufficiently familiar with NPT to comment on attempts to translate its constructs into plain language. We invited 60 researchers to take part; they belonged to four categories: Medicine (n = 18), Nursing and Midwifery (n = 16), Professions Allied to Medicine (n = 3), and Health Services Research and Social Science (n = 23).

Respondents were asked to provide feedback using an on-line pro forma composed of a series of open-ended questions constructed using SurveyMonkey™ (a proprietary on-line survey tool), described in Additional File 2. The duration of this exercise was 21 days. We also invited members of the criterion sample to snowball the on-line form to members of their research groups and to other interested colleagues. Participants were asked to identify themselves by name and email address so that we could distinguish between those recruited directly and those who had copies forwarded to them as part of the snowball. We sent a single email reminder on 8 June 2010.

Data collected in this process took the form of short free-text entries typed directly into the SurveyMonkey pro forma by respondents. Free-text entries consisted of specific comments about items and statements, and more broadly focused comments about what respondents understood the value and limits of the toolkit to be. The comments about items and statements were extracted and then aggregated, according to the item to which they referred, in a matrix or framework [43]. This provided a basis for subsequent work to improve the clarity and fidelity of each statement. We treated the comments about the value and limits of the toolkit as attributive statements and analysed them using a simple, descriptive thematic analysis [44].
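By way of illustration only, the short sketch below shows one way such a framework matrix could be assembled in code. The respondents, statement numbers, and comments are invented, and this is not the procedure used in the study itself.

```python
# Minimal sketch of a framework-style matrix for free-text comments,
# aggregated by the statement they refer to. All data here are invented
# purely for illustration; the study's own analysis was done qualitatively.
import pandas as pd

comments = pd.DataFrame([
    {"respondent": "R01", "statement": 2, "comment": "Overlaps with statement 14."},
    {"respondent": "R02", "statement": 2, "comment": "Wording is ambiguous."},
    {"respondent": "R03", "statement": 9, "comment": "'Enact' reads as too technical."},
])

# One row per statement, gathering every comment made about it; this kind of
# matrix is the basis for deciding how each statement should be reworded.
framework = comments.groupby("statement")["comment"].apply(list)
print(framework)
```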

Road testing the web-enabled tool

The final component of this work was to embed the improved and edited statements and explanations in a web-enabled tool (available at http://www.normalizationprocess.org between August 2010 and July 2013) and to invite users to apply the tool in practice and comment on it. We already had some experience of designing web-enabled tools [45]. We released the web-enabled tool on 26 July 2010, sending a URL link and invitation to researchers who had responded to our earlier on-line questionnaire, and inviting them to snowball the URL to interested colleagues. We also made a single announcement on Twitter.com and on CRM's personal web-page at academia.edu, again for the purposes of snowballing.

Participants in this phase of our work were asked to work through an implementation problem using slide bars to give a subjective score to each of the statements embedded in the tool (for an example, see Figure 1), and to interpret the results of this work through a set of radar plots (see Figure 2). One of us (CRM) also field-tested the tool with 30 participants at a meeting at the Faculty of Health and Social Development, University of Victoria, British Columbia, on 29 and 30 July 2010, working through two implementation problems: a falls prevention initiative, and the development of a large collaborative project between the University and the Vancouver Island Health Authority.

Figure 1: NPT Toolkit - Web-interface - Sliding Toolbar.

Figure 2: NPT Toolkit - Reporting page - Individual Radar Plots for each Construct.
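By way of illustration, the sketch below shows how slider scores for the four statements of a single construct might be rendered as a radar plot of the kind reported by the toolkit (Figure 2). It is a minimal, hypothetical example rather than the toolkit's own code; the labels, scores, scale, and plotting choices are assumptions.

```python
# Minimal sketch (not the toolkit's actual code): plotting slider scores for
# one NPT construct as a radar plot. Scores and labels are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical 0-10 slider scores for the four Collective Action statements.
labels = ["Perform tasks", "Maintain trust", "Allocate work", "Organizational support"]
scores = [7, 5, 6, 3]

angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
angles += angles[:1]          # repeat the first angle to close the polygon
values = scores + scores[:1]  # repeat the first score likewise

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, values, linewidth=2)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(labels)
ax.set_ylim(0, 10)
ax.set_title("Collective Action")
plt.show()
```

A plot of this kind gives a quick visual impression of where, within a construct, the work of implementation appears weakest.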

Results

Responses

As Table 1 shows, we emailed a criterion sample of 60 researchers, and achieved a response of 50/60 between 1 and 21 June 2010. In addition to our criterion sample, we received responses from nine other 'snowball' respondents. Of the 10 members of the criterion sample who did not respond, four were away on sabbatical or other leave; we have no information about the other six non-respondents. Of the criterion sample, 10/50 communicated their views about statements and explanations by email or telephone to CRM. Only one member of this group provided a detailed critique of the statements and explanations; the remainder made general comments about their focus and orientation. The majority of the data we received was derived from the 40 criterion sample respondents and the nine snowball respondents who replied using the SurveyMonkey tool. We have combined responses from these two groups for qualitative analysis. Table 2 describes the structure and geographical distribution of the combined study group.

Table 1 Purposive Sample of Respondents: Statement development phase
Table 2 Professional structure of combined criterion and snowball samples: Statement development phase

Respondents using the SurveyMonkey pro forma asserted that they were familiar with NPT. Only 12 suggested that they possessed a low level of familiarity with the theory. We asked participants to read the statements and their explanations and to work through them in relation to an implementation practice or research problem. These respondents applied NPT to a wide variety of problems. Not all respondents provided sufficient information to identify these, but we could identify problems related to Primary Care (n = 14), Hospital Medicine (n = 7), Nursing and Midwifery (n = 6), Health Informatics (n = 5), Social Care (n = 4), and Public Health (n = 3). Ten respondents identified themselves as already using NPT as a basis for ongoing studies, and six were, or had been, involved in designing studies in which NPT was integral but which were not yet operational. In at least five of these cases, this work was accomplished in groups. A further 23 respondents said that they had reviewed the statements and their explanations through the medium of thought experiments about potential or actual implementation projects. A small number of respondents told us about the time committed to this task. This ranged from 20 minutes to three hours.

The web-enabled tool had been released for testing in a way that maximized commentary from real users. We embedded Google Analytics HTML code in the website, and this enabled us to obtain some limited data about its usage and users. During the pilot period (26 July to 26 August 2010) the website attracted 327 visits (139 new visitors and 188 return visitors); details of these are given in Table 3. Time on site ranged from 0 to 21 minutes (mean 4.15 minutes), and page views per visit ranged from one to 21 (mean 5.11). From the 139 new visitors we received some 15 detailed comments on their experience of the site, submitted using free-text boxes that users could fill in as they worked through the site.

Table 3 Visits to http://www.normalizationprocess.org: beta testing phase

The on-line survey

All but three participants were supportive of the approach we had taken and of the statements presented to them. Many made enthusiastic comments about this, and remarked that the statements improved the workability of NPT in practice. This was especially so amongst those without a background in the social sciences. We had invited respondents to be critical, however, and most had important and useful comments to make. These took two forms. First, many respondents offered specific criticisms of the statements and their explanations. These are grouped and described in Additional File 1 (see second column, 'Users' Critique'). They related to three main kinds of problem: ambiguously worded statements and explanations; overlap, where some statements and their explanations appeared to cover the same ground as others; and dissonance, where some statements and their explanations appeared to express different concepts. As we have noted, most respondents were very positive about the statements and explanations. A medical researcher told us that:

It provided food for thought about the issues involved in trying to bring together a team of both researchers and practitioners to design and implement an intervention. In particular it helped me to understand that the reasons why we are having so much difficulty is that the research team themselves do not have a shared view and understanding of what the intervention is we are trying to implement and this is contributing to our problems in engaging the primary care partners in the project.

A nurse researcher told us that:

The questions serve as an inventory; anticipatory guidance before embarking on a change in practice or as a reflective/evaluation tool. In my example, the intervention was introduced to the inter-disciplinary [team] as a 'pilot'. I was asked to assist with evaluating the 'pilot'. If these 16-questions would have been available I could envision utilizing them as a guide for evaluation focus groups/interviews with end-users.

In these contexts, respondents seemed to be using the statements and their explanations in exactly the way we had intended them - as sensitizing tools, heuristic devices, to support thinking through an implementation task. Importantly, though, we did not intend these statements to be used as the basis for specific research instruments or as verbatim statements for an interview schedule.

Beyond this, respondents offered interesting and useful general critiques that often made wider methodological points. One health services researcher wrote that:

[I] can see why it is seductive. I imagine some of it might work for trial interventions where you have a clear comparator - e.g. differentiate the intervention from usual practice (our 'intervention' is the work now and we do not really have a comparator as such). It looks helpfully simple (so will appeal to many because of this) - not too long - easy to read - etc but then using it, it unravels and seems less useful (I feel a bit the way I did the first time I used the SF36 in a face to face interview - I ended up wanting to qualify every answer)

This reflects the central problem with the process of translation and simplification: it reduces the potential for acknowledging complexity within the tool. A further problem was that a small number of respondents read the statements as something analogous to a structured research instrument, rather than as a set of statements intended to sensitize users to process problems in implementation. Complexity was added, too, by the use of theoretical vocabulary within the explanations and beyond. Another respondent wrote that:

I felt some of the language was still too technical. I would not use your technical descriptions "differentiation" etc - just ... complicate the understanding of the concept by using words which could be interpreted as having a different meaning to the one expressed in the question. Specific examples: 3 "make sense of the work" - would understand better as "make sense of what they had to do" (and work in 7) 8 "define the actions and procedures" - perhaps "define what needs to be done" 9 "enact the intervention" - perhaps "carry out the intervention" 10 see above 14 and 15 - I prefer "think it is worthwhile" or "agree about the worth of the effects" - it is the phrase "worth of the effects" which feels a little foreign.

While for others it was:

A little tricky to work with at times. The terms don't always appear to coincide with the descriptions provided. Sometimes it was helpful to simply ignore the term, and concentrate on the description. Furthermore, the bolded "headline" doesn't always convey what is indicated in the explanation below it.

Several respondents remarked on the problem of integrating their understanding of the statements and their explanations at a more general level.

I am not sure if having 2 bits of text i.e. question and description for each question might confuse some people (as I have had this mentioned to me at a conference when I did something similar) although personally I do feel it helps the users understanding and quite like it.

Once again, these problems stem from the process of reduction and editing that led to the construction of the statements and their explanations. A small number of respondents sought to suggest solutions to such problems. For example:

It might be best to have a two part question with an amplification of the question in the second part. For example, "participants can/could discover the effects of the intervention", for example "from formal or informal evaluation". Also the "questions" are not phrased as questions but as statements - would be better as questions.

The qualitative analysis that we present here is simple and descriptive. Data were in the form of free-text entries in an on-line pro forma. Respondents invested a good deal of effort in working through the statements and their explanations. As we have seen, they identified problems of meaning (focusing on the content of statements and their explanations) and of structure (focusing on the relationship between individual statements and their explanations).

Responses to the web enabled tool

We received a small number of electronic and in-person responses to the web-enabled tool. Most of these were congratulatory. One respondent, a sociologist, felt that the web-enabled tool over-simplified NPT and would therefore be difficult to interpret. Two respondents pointed to continuing ambiguity or overlap between statements 2 and 14, 3 and 15, 5 and 11, and 6 and 7. To solve this problem we amended these items again. Other users sought more advice about how to solve implementation problems, and a reduction in 'jargon'. For one user, however, the result was clarity and workability:

Love it, at least I can understand it now. All I need to remember is SPAM (sense-making, participation, action, monitoring). This will be a great tool to map progress.

Despite the undesirable mnemonic 'SPAM', this was the result that we were aiming for.

Final set of statements

The key result of this process was a set of statements that expressed, in the simplest possible terms, the components of the four constructs of Normalization Process Theory, and that could be applied in practice as heuristic tools for working through implementation and evaluation problems. The final set of statements produced through this process was as follows (an illustrative sketch of how these statements group by construct appears after the list):

  1. participants distinguish the intervention from current ways of working

  2. participants collectively agree about the purpose of the intervention

  3. participants individually understand what the intervention requires of them

  4. participants construct potential value of the intervention for their work

  5. key individuals drive the intervention forward

  6. participants agree that the intervention should be part of their work

  7. participants buy into the intervention

  8. participants continue to support the intervention

  9. participants perform the tasks required by the intervention

  10. participants maintain their trust in each other's work and expertise through the intervention

  11. the work of the intervention is allocated appropriately to participants

  12. the intervention is adequately supported by its host organization

  13. participants access information about the effects of the intervention

  14. participants collectively assess the intervention as worthwhile

  15. participants individually assess the intervention as worthwhile

  16. participants modify their work in response to their appraisal of the intervention
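The statements above follow the order in which the four constructs were introduced (coherence, cognitive participation, collective action, and reflexive monitoring), four statements per construct. As a minimal illustrative sketch, and assuming that each statement receives a subjective slider score (here on a 0-10 scale, an assumption on our part), the grouping and a per-construct summary might be represented as follows; the names and scoring function are illustrative, not the toolkit's own code.

```python
# Minimal sketch: grouping the 16 statements by NPT construct and averaging
# subjective scores per construct. The scoring scale, function, and variable
# names are illustrative assumptions, not the toolkit's implementation.
from statistics import mean

CONSTRUCTS = {
    "Coherence": [1, 2, 3, 4],                 # sense-making work
    "Cognitive Participation": [5, 6, 7, 8],   # relational work of engagement
    "Collective Action": [9, 10, 11, 12],      # operational work of enacting
    "Reflexive Monitoring": [13, 14, 15, 16],  # appraisal work
}

def construct_profile(scores):
    """Average the statement scores (e.g. 0-10 slider values) for each construct."""
    return {name: mean(scores[i] for i in items)
            for name, items in CONSTRUCTS.items()}

# Hypothetical scores keyed by statement number 1-16.
example_scores = {i: s for i, s in enumerate(
    [7, 6, 5, 6, 8, 7, 6, 5, 4, 6, 3, 2, 5, 6, 6, 4], start=1)}
print(construct_profile(example_scores))
```

A summary of this kind corresponds to the construct-level radar plots reported by the web-enabled tool, and mirrors the 'SPAM' mnemonic (sense-making, participation, action, monitoring) noted by one user above.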

Discussion

Respondents' critical comments on statements and explanations, as we have noted, were important and useful. We learned much about how the statements and their explanations were read and understood by a purposive international sample of researchers and practitioners. While respondents were enthusiastic and supportive about the statements and explanations, and valued the translation work that they represented, they also provided criticisms that focused our attention on problems in the way that the theory was understood when it was simplified in this way. This left us with three problems to solve.

First of all it was clear that we needed to rephrase individual statements to make their meanings clear, and to reduce problems of 'fine distinction' and overlap that affected some of them - especially in relation to statements 2, 3, 14 and 15. In fact, we rewrote almost all of the statements, working not only to clarify their meanings but also their purpose as heuristic devices to help users think through implementation processes rather than measure them. This involved producing and then choosing - by means of a simple vote by each member of the project team - alternative forms of words for each statement, and where necessary the explanation. We then undertook a final amendment phase to make them workable. The progression from original statement and explanation - through respondents' criticisms, alternative wording, voting choice, and to final version - is shown in detail in Additional File 1. We repeated this process after users had responded to the web-enabled tool. This led to the final set of statements.

The second problem was whether to do additional work to marry statements and their explanations more effectively, or whether to remove the explanations themselves. Some respondents had made a strong case for removing the explanations on the grounds that they would confuse novice users or distract expert ones. In this context, we also had to take account of the usability of the statement and explanation in the on-line toolkit. The combination of these factors led us to decide to include explanations on the web-interface (see Figure 1), and on its reports (see Figure 2). They are also embedded elsewhere in the on-line Users' Manual for NPT, where they are linked to more detailed accounts of the theory's constructs.

Finally, and rather less importantly, we had to decide whether or not to acknowledge the specific theoretical origin of each statement by assigning it the name of the component of NPT to which it referred. We chose to drop these names from the toolkit, although they remain elsewhere in the on-line Users' Manual. A first limitation of this study is that our sample may be biased towards a favourable view of NPT by virtue of respondents' previously expressed interest in NPT and earlier personal contacts. A second limitation is that it is also biased towards respondents working in some capacity in academia over those working as full-time practitioners. As such, the practitioner group is relatively small, and this may have implications for the potential usability of the tool for this group. Clearly, irrespective of researcher enthusiasm, practitioners, managers, and policy makers, alongside patients and carers, are central to the successful embedding of interventions. However, we should note that many of these academics also had commitments as clinical practitioners, healthcare managers, and policy makers. A third limitation is that constraints on time and resources did not permit us, at this stage in the project, to perform cognitive interviews in which users of the statements worked through them while thinking aloud. Overall, using email and a web-based tool to collect qualitative (textual) data from a purposive sample of international researchers and practitioners was highly successful, with a very small number of non-respondents. The 59 researchers and practitioners who responded to our qualitative data collection tool, and the 13 who commented on the beta version of the toolkit at http://www.normalizationprocess.org, were supportive and helpful, and consistently provided us with valuable critical comments.

Conclusions

The funding program that supported the work described in this paper was intended to support the translation of social science research into products that would have value for the wider polity. Our aim in this paper is to show how we worked towards this objective. Our aim for the project itself was to take the core constructs and components of a sociological theory and translate them into the simplest possible set of statements. These statements were designed to be used as heuristic devices in an on-line toolkit for users of the theory, and not to define questions that could be used as the basis of an instrument to measure variables derived from NPT's constructs and their components. As a result of this work we have been able to develop a simplified set of statements and explanations that translate a sociological theory into a 'user friendly' form of words. This is an important step in crossing the translational gap between the complex language of academic expert communities and the multiple everyday needs of researchers and practitioners in applied settings [46].

Conflict of interests

CRM led the program of theory building underpinning the work described here, and all authors have contributed to the development of NPT.

References

  1. Campbell M, Fitzpatrick R, Haines A, Kinmonth A, Sandercock P, Spiegelhalter D, Tyrer P: Framework for design and evaluation of complex interventions to improve health. BMJ. 2000, 321: 694-696. 10.1136/bmj.321.7262.694.

  2. Campbell NC, Murray E, Darbyshire J, Emery J, Farmer A, Griffiths F, Guthrie B, Lester H, Wilson P, Kinmonth AL: Designing and evaluating complex interventions to improve health care. Brit Med J. 2007, 334 (7591): 455-459. 10.1136/bmj.39108.379965.BE.

  3. Angus D, Brouwers M, Driedger M, Eccles M, Francis J, Godin G, Graham I, Grimshaw J, Hanna S, Harrison MB, et al: Designing theoretically-informed implementation interventions The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG). Implement Sci. 2006, 1.

  4. Grol RP, Bosch M, Hulscher M, Eccles M, Wensing M: Planning and studying improvement in patient care: the use of theoretical perspectives. Milbank Quarterly. 2007, 85 (1): 93-138. 10.1111/j.1468-0009.2007.00478.x.

  5. Goldthorpe JH: On Sociology: Critique and Program. 2006, Stanford: Stanford University Press

  6. Hechter M, Horne C: Theory is explanation. Theories of social order. Edited by: Hechter M, Horne C. 2003, Stanford CA, Stanford University Press

  7. May C, Finch T: Implementation, embedding, and integration: an outline of Normalization Process Theory. Sociology. 2009, 43 (3): 535-554.

  8. May C: A rational model for assessing and evaluating complex interventions in health care. BMC Health Services Research. 2006, 6: 86-10.1186/1472-6963-6-86.

  9. May C, Finch T, Mair F, Ballini L, Dowrick C, Eccles M, Gask L, MacFarlane A, Murray E, Rapley T, Rogers A, Treweek S, Wallace P, Anderson G, Burns J, Heaven B: Understanding the implementation of complex interventions in health care: the normalization process model. BMC Health Services Research. 2007, 7: 148-10.1186/1472-6963-7-148.

  10. Ajzen I: The theory of planned behavior. Organizational Behavior and Human Decision Processes. 1991, 50: 179-211. 10.1016/0749-5978(91)90020-T.

  11. Rogers EM: The Diffusion of Innovations. 1995, New York: Free Press, 4

  12. Latour B: Reassembling the Social: An Introduction to Actor Network Theory. 2005, Oxford: Oxford University Press

  13. Bunge M: How does it work? The search for explanatory mechanisms. Philosophy of the Social Sciences. 2004, 34 (2): 182-210. 10.1177/0048393103262550.

  14. Gerring J: The mechanismic worldview: Thinking inside the box. British Journal of Political Science. 2008, 38: 161-179.

  15. Lieberson S, Lynn FB: Barking up the wrong branch: Scientific alternatives to the current model of sociological science. Annual Review of Sociology. 2002, 28: 1-19. 10.1146/annurev.soc.28.110601.141122.

  16. Gask L, Lever-Green G, Hays R: Dissemination and implementation of suicide prevention training in one Scottish region. BMC Health Serv Res. 2008, 8.

  17. Gask L, Rogers A, Campbell S, Sheaff R: Beyond the limits of clinical governance? The case of mental health in English primary care. BMC Health Serv Res. 2008, 8.

  18. May CR, Montori VM, Mair F: Understanding patients' experiences of treatment burden in chronic heart failure using normalization process theory. Annals of Family Medicine.

  19. Coburn NG, Guller U, Baxter NN, Kiss A, Ringash J, Swallow CJ, Law CHL: Adjuvant therapy for resected gastric cancer-rapid, yet incomplete adoption following results of intergroup 0116 trial. Int J Radiat Oncol Biol Phys. 2008, 70 (4): 1073-1080. 10.1016/j.ijrobp.2007.07.2378.

  20. Kirsh S, Lawrence R, Aron D: Tailoring an intervention to the context and system redesign related to the intervention: A case study of implementing shared medical appointments for diabetes. Implementation Science. 2008, 3: article 34.

  21. May C, Mair F, Dowrick C, Finch T: Process evaluation for complex interventions in primary care: understanding trials using the normalization process model. BMC Family Practice. 2007, 8: 42-10.1186/1471-2296-8-42.

  22. Speed C, Heaven B, Adamson A, Bond J, Corbett S, Lake AA, May C, Vanoli A, McMeekin P, Moynihan P, et al: LIFELAX - diet and LIFEstyle versus LAXatives in the management of chronic constipation in older people: randomised controlled trial. Health Technol Assess. 2010, 14 (52): 1-+.

  23. Gask L, Bower P, Lovell K, Escott D, Archer J, Gilbody S, Lankshear A, Simpson A, Richards D: What work has to be done to implement collaborative care for depression? Process evaluation of a trial utilizing the Normalization Process Model. Implement Sci. 2010, 5 (15).

  24. Gunn JM, Palmer VJ, Dowrick CF, Herrman HE, Griffiths FE, Kokanovic R, Blashki GA, Hegarty KL, Johnson CL, Potiriadis M, et al: Embedding effective depression care: using theory for primary care organisational and systems change. Implement Sci. 2010, 5.

  25. Kennedy A, Chew-Graham CA, Blakeman T, Bowen A, Gardner C, Protheroe J, Rogers A, Gask L: Delivering the WISE (Whole Systems Informing Self-Management Engagement) training package in primary care: learning from formative evaluation. Implement Sci. 2010, 5 (7).

  26. Pencille LJ, Campbell ME, Van Houten HK, Shah ND, Mullan RJ, Swiglo BA, Breslin M, Kesman RL, Tulledge-Scheitel SM, Jaeger TM, et al: Protocol for the Osteoporosis Choice trial. A pilot randomized trial of a decision aid in primary care practice. Trials. 2009, 10.

  27. Elwyn G, Légaré F, van der Weijden T, Edwards A, May C: Arduous implementation: does the Normalisation Process Model explain why it's so difficult to embed decision support technologies for patients in routine clinical practice. Implement Science. 2008, 3: 57-10.1186/1748-5908-3-57.

  28. Mair FS, Hiscock J, Beaton SC: Understanding factors that inhibit or promote the utilization of telecare in chronic lung disease. Chronic Illness. 2008, 4 (2): 110-7. 10.1177/1742395308092482.

  29. King G, Richards H, Godden D: Adoption of telemedicine in Scottish remote and rural general practices: a qualitative study. Journal of Telemedicine and Telecare. 2007, 13 (8): 382-6. 10.1258/135763307783064430.

  30. Gagnon M-P, Legare F, Fortin J-P, Labrecque M, Lamothe L, Duplantie J: An integrated strategy of knowledge application for optimal e-health implementation: A multi method study protocol. BMC Medical Informatics and Decision Making. 2008, 8: article 17.

  31. Obstfelder A, Engeseth KH, Wynn R: Characteristics of successfully implemented telemedical applications. Implementation Science. 2007, 2: article 25.

  32. Boddy D, King G, Clark J, Heaney D, Mair F: The influence of context and process when implementing e-health. BMC Medical Informatics and Decision Making. 2009, 9 (1): 9-10.1186/1472-6947-9-9.

  33. Murray E, Burns J, May C, Finch T, O'Donnell C, Wallace P, Mair F: Why is it difficult to implement e-health initiatives? A qualitative study. Implementation Science.

  34. Gagnon MP, Legare F, Fortin JP, Lamothe L, Labrecque M, Duplantie J: An integrated strategy of knowledge application for optimal e-health implementation: A multi-method study protocol. BMC Med Inform Decis Mak. 2008, 8.

  35. Halford S, Obstfelder A, Lotherington AT: Changing the record: the inter-professional, subjective and embodied effects of electronic patient records. New Technol Work Employ. 2010, 25 (3): 210-222. 10.1111/j.1468-005X.2010.00249.x.

  36. King G, Richards H, Godden D: Adoption of telemedicine in Scottish remote and rural general practices: a qualitative study. J Telemed Telecare. 2007, 13 (8): 382-386. 10.1258/135763307783064430.

  37. Boddy D, King G, Clark J, Heaney D, Mair F: The influence of context and process when implementing e-health. Bmc Med Inform Decis. 2009, 9 (1): 9-10.1186/1472-6947-9-9.

  38. Murray E, Burns J, May C, Finch T, O'Donnell C, Wallace P, Mair F: Why is it difficult to implement e-health initiatives? A qualitative study. Implement Sci. 2011, 6.

  39. Stinchcombe A: Constructing social theories. 1968, New York: Harcourt, Brace and World

  40. May C, Mair FS, Finch T, MacFarlane A, Dowrick C, Treweek S, Rapley T, Ballini L, Ong BN, Rogers A, et al: Development of a theory of implementation and integration: Normalization Process Theory. Implement Sci. 2009, 4 (29).

  41. Mair F, May C, Murray E, Finch T, Anderson G, O'Donnell C, Wallace P, Sullivan F: Understanding the Implementation and Integration of E-Health Services London: National Co-ordinating. 2009, Centre for the National Institute for Health Research Service Delivery and Organisation Programme (NCCSDO)

  42. May C, Finch T, Cornford J, Exley C, Gately C, Kirk S, Jenkings KN, Mair FS, Osbourne J, Robinson AL, et al: Integrating Telecare for Chronic Disease Management in the Community: What Needs to be Done?. 2009, London: NIHR

  43. Ritchie J, Spencer L: Qualitative data analysis for applied policy research. Analysing Qualitative Data. Edited by: Bryman A, Burgess R. 1994, London: Routledge, 173-194.

  44. Banister P, Burman E, Parker I, Taylor M, Tindall C: Qualitative methods in psychology. 1994, Buckingham: Open University Press

  45. Murray E, May C, Mair F: Development and formative evaluation of the e-Health Implementation Toolkit (e-HIT). BMC Medical Informatics and Decision Making. 2010, 10 (61).

  46. Murray E, Treweek S, Pope C, MacFarlane A, Ballini L, Dowrick C, Finch T, Kennedy A, Mair F, O'Donnell C, et al: Normalisation Process Theory: a framework for developing, evaluating and implementing complex interventions. BMC Medicine. 2010, 8 (63).

Acknowledgements & Funding

Research and website development reported in this paper was made possible by funding from the UK Economic and Social Research Council (RES-189-25-0003). This grant was held by CRM, EM, FSM, TR, TF and ST. We thank Jan Legge and Orla O'Donnell for providing secretarial support. Some aspects of the work also benefited from the award to EM and CRM of National Institute for Health Research funding for a National School of Primary Care Research Peer Learning Set on the development of Normalization Process Theory, and we thank members of that group--Anne Rogers, Catherine Pope, Anne Kennedy, Bie Nio Ong, Chris Dowrick, Rob Wilson, Kate O'Donnell, and Stephanie Tooth--for their support of this enterprise. In addition, we thank Victor Montori, Nilay Shah, David Eton, Mary Ellen Purkis, Lynn Stevenson, and Chris May for their important contributions to our work. Views presented in this paper are those of the authors and not of the UK Economic and Social Research Council or the Department of Health.

Author information

Corresponding author

Correspondence to Tim Rapley.

Additional information

Authors' contributions

All authors have contributed practically and intellectually to the work that led to this paper and have commented and agreed on the manuscript. CRM led the study. TR led the development of the web-enabled tool.

Electronic supplementary material

Additional file 1: Progression from Original Statement and Explanation to beta testing phase. Table in landscape format. (PDF 121 KB)

Additional file 2: Qualitative Data Collection Using On-Line Pro Forma - Questions Asked. List of questions asked in the on-line survey. (PDF 37 KB)

Rights and permissions

Open Access: This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

May, C.R., Finch, T., Ballini, L. et al. Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit. BMC Health Serv Res 11, 245 (2011). https://doi.org/10.1186/1472-6963-11-245
