Our observations suggest that NHS-DPP delivery was generally in line with the NHS service specification [9] with regard to session content and processes, according to the data extracted using the TIDieR framework [13]. Researcher field notes indicated positive patient experiences, as well as negative patient experiences relating to the structure of the NHS-DPP. There are significant differences in organisation and mode of delivery, which appear to have generated both positive and negative responses from patients of the NHS-DPP. In particular, positive patient experiences tended to be observed within sessions (e.g. group rapport, engagement with interactive activities), whereas negative patient experiences tended to be linked to structural issues (e.g. session scheduling, group sizes, the venue and issues with resources). More instances of positive patient experience were observed when provider programmes included more visual and interactive activities, delivered in groups of 10–15 people. Thus, there are some issues which, if addressed by providers, may improve the overall running of the programme.
Strengths and limitations
To our knowledge, this is the first paper describing NHS-DPP delivery, including structural features, within-session process features and observed patient experience. We observed the whole NHS-DPP course at each of the eight sites, and observed a wide range of facilitators with varying experience and backgrounds. All sessions were audio-recorded, a ‘gold standard’ for fidelity evaluations [22], enabling more detailed extraction of data on programme delivery. The use of the TIDieR framework [13] allowed the complexities of delivering such an intervention in practice to be transparently documented.
This paper has provided a novel way of assessing patient experience, utilising in-depth observational notes across a much larger sample than would have been possible with qualitative interview research. Thorough observational notes (up to two sides of A4) were written per session. However, the patient experience described in this paper was that observed by researchers and documented in their notes; observed patient experience therefore reflects researchers’ interpretation. Only the field notes for each session were analysed to assess observed patient experience, as the audio recordings comprised over 200 hours of data. Nevertheless, researchers’ observational notes were able to capture occurrences within sessions and non-verbal aspects of delivery that would not otherwise have been captured on the audio recordings (e.g. informal conversations with patients, group interactions during activities). Researcher field notes are not often used to analyse or present data; however, we found them a useful method for providing an in-depth analysis of what happened within sessions, particularly with regard to the structural issues observed. These are valuable insights into the running of a national programme and may contribute to its future success and continuation.
Researchers were only able to observe and document volunteered views in their field notes; thus the data presented are naturally occurring. For example, if a particular occurrence (such as high engagement) was not documented in researchers’ field notes for a session, this does not mean it did not happen. Further, researchers could only observe and interact with patients who continued to attend the NHS-DPP. Consequently, this study cannot provide insights into barriers to attendance; a limitation also present in previous research on patient experience of the NHS-DPP [16]. Although no formal estimation of observation reliability was conducted, researchers discussed beforehand what should be included in the observational notes, and both researchers documented similar types of observations (see Additional files 2 and 3). Further, although each session was attended by a single observer, each researcher observed the delivery of a whole programme from each of the four providers delivering the NHS-DPP.
Although we attended the full NHS-DPP course at eight sites across England, a sample of eight geographical sites is still small; however, this was all that was feasible given the length of the NHS-DPP programme (9–12 months) and the resources required for intensive observation. Further, we cannot be sure from these observations whether the issues identified relate to the particular courses we observed, or are systemic issues reflecting the way NHS-DPP providers organise and run their courses. However, we were able to identify associations between providers’ programme characteristics and observed patient experience at the eight sites attended.
Despite this, our purposive sample sought to capture diversity in NHS-DPP settings; researchers aimed to sample sites with as much variation in SES, ethnicity and geographical location as possible, and 455 participants were consented. This approach has advantages over qualitative research examining patient experience, which gathers in-depth views from a smaller number of participants, usually restricted to fewer geographical sites. Our study used observational data rather than self-report and is therefore not subject to the same reporting biases, although it could be argued that observed patient experience is less direct than gathering patients’ views in interviews. Nonetheless, this study provides a description of the patient experiences observed in the NHS-DPP, with an analysis that has the advantage of identifying broad patterns in the data.
Relation to existing research
The only previous research to explore patients’ experiences of the NHS-DPP was conducted during the pilot phase [16]. However, this interview study was conducted before the NHS-DPP was implemented nationally, patient data were analysed alongside data from other NHS-DPP stakeholders, and comparison of patient experiences across sites and providers was not within that evaluation’s scope. Nonetheless, our observations are in accordance with those findings: positive relationships were observed within groups, both with peers and facilitators, and patients reported positive behavioural changes to their lifestyles. The current study has further highlighted two new findings relating to the delivery of the NHS-DPP: the use of interactive and visual activities within NHS-DPP sessions (process features) appears to enhance patient experience, whereas issues with structural features of the programme, such as session scheduling, group size and session resources, appear to impact negatively on patient experience.
Whilst previous qualitative studies on patient experiences of other behavioural interventions have provided in-depth insight into interpersonal factors and participant motivations [23,24,25], they have not fully addressed the structural factors that are also important for the successful implementation of an intervention. Although the qualitative literature alluded to the possibility that locating programme sites in more accessible locations, with greater flexibility in session times and days, could further improve patient experience [23,24,25], this has not been extensively researched. Such structural features can have a direct influence on the processes and outcomes of programmes; for example, if there are insufficient resources and equipment, or sessions cannot be scheduled, patients can be prevented from accessing the support they require [26]. In the present study, the scheduling of sessions, group sizes, issues with provider resources and site venues were all structural issues that appeared to negatively affect patient experience. Recently published data on the early outcomes of the NHS-DPP highlight that course completion differed between providers [8]; comparing our data with these early outcomes, it appears that the providers with the lowest course completion rates [8] had more scheduling issues observed in the current study.
Implications for practice
We observed disengagement within sessions when patients reported that information was difficult to understand, when there were issues with obtaining session resources, and when group sizes exceeded 15 people. Although we cannot be certain whether these are systemic issues in the way NHS-DPP providers deliver their courses, our findings suggest that delivering more interactive activities with less complex information, and having sufficient resources to supplement the session content, may enhance patient experience. Given that the NHS-DPP is the first diabetes prevention programme to be rolled out nationally in routine practice, our findings may be of value for making improvements to future waves of the programme, and for commissioners of other public health initiatives.
Despite the NHS-DPP being a national programme, there is variation in how the intervention is delivered by providers. There are clearly some deviations from providers’ protocols and from what was specified by NHS England [9], but we do not know whether such variation is also present in other behavioural programmes. The importance of tailoring intervention content to group demographics, known as adaptation in form (e.g. variation within providers according to local context), is often argued [17, 27]. However, other types of variation between providers (e.g. group sizes) may be explained by pressures faced by providers, such as waiting lists to get onto the course, people wanting to switch courses, local insight into ‘did-not-attend’ rates, or incentives for commercial providers to enrol patients onto the NHS-DPP. Such variation is not adaptation, but suggests drift from the original NHS Service Specification [9], especially at the structural level with regard to the scheduling of sessions and group sizes. Too much drift from the specification will result in the NHS-DPP not being delivered with fidelity to the evidence base, and it is unclear how this would impact on effectiveness. The structural issues observed highlight the wider challenges of rolling out a national programme: the NHS-DPP started with seven small pilot sites in 2015 and was rolled out as a multi-site programme in 2016, with commercial providers under pressure to deliver results with limited capacity across large geographical areas.
Implications for research
Now that the NHS-DPP is in its fourth year of implementation, some of these structural issues may have since been resolved, although this is not certain. A fifth provider has now been commissioned to deliver the NHS-DPP alongside the other four; further observations of the NHS-DPP in the field would be beneficial to establish whether these structural features (e.g. session scheduling) remain an issue, or whether such issues are only present in the early stages of programme implementation. Further, it would be useful to replicate and advance this method using two researchers at each site observation and formal reliability testing, which would strengthen the wider applicability of the approach used here.
Future research could also examine the impact that different facilitator characteristics may have on the outcomes of the NHS-DPP. We observed 36 facilitators from diverse backgrounds with varying levels of experience and facilitating styles. Our observations suggested that good relationships with facilitators were linked with positive patient experience, but it is not clear which features of facilitators or their training best bring this about. Further, we do not know the impact of the therapeutic relationship between the facilitator and the group on learning and retention in the NHS-DPP. Qualitative interviews with facilitators about their views and experiences of delivering different aspects of the NHS-DPP would give insight into additional requirements for facilitators going forward. Lastly, the authors of the present study have also assessed fidelity of delivery of behaviour change techniques in the NHS-DPP, which is described in a separate publication [28].