  • Research article
  • Open access

The Care Process Self-Evaluation Tool: a valid and reliable instrument for measuring care process organization of health care teams

Abstract

Background

Patient safety can be increased by improving the organization of care. A tool that evaluates the actual organization of care, as perceived by multidisciplinary teams, is the Care Process Self-Evaluation Tool (CPSET). CPSET was developed in 2007 and includes 29 items in five subscales: (a) patient-focused organization, (b) coordination of the care process, (c) collaboration with primary care, (d) communication with patients and family, and (e) follow-up of the care process. The goal of the present study was to further evaluate the psychometric properties of the CPSET at the team and hospital levels and to compile a cutoff score table.

Methods

The psychometric properties of the CPSET were assessed in a multicenter study in Belgium and the Netherlands. In total, 3139 team members from 114 hospitals participated. Psychometric properties were evaluated by using confirmatory factor analysis (CFA), Cronbach’s alpha, intraclass correlation coefficients (ICCs), the Kruskal-Wallis test, and the Mann–Whitney test. For the cutoff score table, percentiles were used. Demographic variables were also evaluated.

Results

CFA showed a good model fit: a normed fit index of 0.93, a comparative fit index of 0.94, an adjusted goodness-of-fit index of 0.87, and a root mean square error of approximation of 0.06. Cronbach’s alpha values were between 0.869 and 0.950. The team-level ICCs varied between 0.127 and 0.232 and were higher than those at the hospital level (0.071-0.151). Male team members scored significantly higher than females on 2 of the 5 subscales and on the overall CPSET. There were also significant differences among age groups. Medical doctors scored significantly higher on 4 of the 5 subscales and on the overall CPSET. Coordinators of care processes scored significantly lower on 2 of the 5 subscales and on the overall CPSET. Cutoff scores for all subscales and the overall CPSET were calculated.

Conclusions

The CPSET is a valid and reliable instrument with which health care teams can measure the extent to which care processes are organized. The cutoff table permits teams to compare how they perceive the organization of their care process relative to other teams.


Background

Recent studies on care quality improvement and patient safety show that health care is still not safe and that the number of adverse events is underestimated [1–3]. Health care across organization levels is often poorly organized, complex, and uncoordinated. Furthermore, not all patients receive consistent, high-quality medical care. The organization of care can be made more effective when multidisciplinary teams are involved in its organization and when care is organized around medical conditions and care processes [4, 5].

Care processes contain key interventions that support the diagnosis or treatment of patients. These key interventions form unique bundles of products and services whose delivery can be achieved effectively by temporary firms. Temporary firms are health care providers who operate together when a specific, well-defined patient group is admitted to a care facility or institution. These teams contain members from different health care professions, have a shared clinical purpose, and have direct care responsibilities [6, 7]. They work in a complex environment, under interactive and dynamic conditions, and their membership changes frequently. They are therefore action teams [7, 8]. The challenges for these teams are effective communication, coordination, and control over the care process [7].

A care process has five key characteristics that affect the organization of care: coordination of the care process, patient-focused organization, communication with patients and family, collaboration with primary care, and follow-up of the care process [9]. The Care Process Self-Evaluation Tool (CPSET) assesses these five key characteristics by using a 29-item Likert scale. It is based on the perceptions of team members involved in organizing a care process. The primary study assessing the validity and reliability of this tool was performed in 2007, and statistical analysis of the five factors produced Cronbach’s alpha values between 0.776 and 0.928 [9]. In 2011, an international Delphi study was performed to identify indicators that affect multidisciplinary teamwork in care processes. This study showed that CPSET is a good tool for following up improvements in multidisciplinary teamwork [10].

Since its original validation, CPSET has been used in different organizations and teams. The first aim of this study was to evaluate the stability of the psychometric properties of CPSET. The second aim was to calculate cutoff scores for the subscales and overall score, with the goal of helping health care managers rank the CPSET scores of their teams.

Methods

Study design and sample

The present study was a cross-sectional, multicenter study involving 114 organizations in Belgium and the Netherlands. Data were collected between November 2007 and October 2011. The participating organizations were members of the Belgian-Dutch Clinical Pathway Network (http://www.nkp.be) [11–13]. These organizations were offered the opportunity to use the CPSET to evaluate the organization level of care processes after its original validation in 2007. Team leaders decided which team members would complete the CPSET. Teams received feedback on their scores after they sent their data to the central database at Leuven University. We received a total of 3378 questionnaires from team members. Questionnaires with more than three missing values on the 29-item CPSET were excluded from analysis. Thus, 3139 questionnaires were included for secondary analysis in this study. Informed consent was obtained from all participants. The secondary analysis was performed as part of a larger study with the ethical approval of the University Hospitals of Leuven for Belgium (identifiers: B32220096036 and B32330096038) [14].
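The exclusion rule above (discard questionnaires with more than three missing values out of the 29 items) amounts to a simple row filter. A minimal sketch with pandas, using toy data and invented column names rather than the study's actual dataset:

```python
import numpy as np
import pandas as pd

# Toy frame: one row per questionnaire, 29 CPSET items (1-10, NaN = missing)
items = [f"item{i}" for i in range(1, 30)]
df = pd.DataFrame(np.full((4, 29), 7.0), columns=items)
df.iloc[0, :4] = np.nan  # four missing values -> excluded
df.iloc[1, :2] = np.nan  # two missing values -> kept

# Keep questionnaires with at most 3 missing values, i.e. >= 26 answered items
included = df.dropna(thresh=26)
print(len(included))  # 3 of the 4 toy questionnaires are retained
```

The `thresh` argument of `dropna` keeps rows with at least that many non-missing values, which matches the "more than three missing" exclusion criterion directly.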

Measures

A two-part questionnaire was used for data collection. The first part collected demographic data: age, gender, profession, and the kind of care process evaluated. The second part consisted of the 29-item CPSET. Each item was scored on a 10-point ordered categorical scale, ranging from totally disagree (1) to totally agree (10) [9]. An average score per subscale (%) and an overall score (%) were calculated.
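The subscale and overall percentages can be derived from the raw item scores. The exact mapping from the 1-10 scale to a percentage is not stated here, so the rescaling below (mean × 10) is an assumed convention:

```python
import numpy as np

def subscale_percent(item_scores):
    """Mean of a subscale's 1-10 item scores, expressed on a 0-100 scale.

    The mean * 10 rescaling is an assumed convention, not taken from the paper.
    Missing answers (NaN) are ignored in the mean.
    """
    return float(np.nanmean(np.asarray(item_scores, dtype=float)) * 10.0)

# Hypothetical respondent answering one five-item subscale
pct = subscale_percent([7, 8, 6, 9, 7])
print(round(pct, 1))  # 74.0
```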

Statistical analysis

We performed secondary data analysis on 3139 questionnaires. Descriptive statistics were used to analyze the demographic characteristics. SPSS version 19.0 was used (SPSS Inc., Chicago, IL, USA). Confirmatory factor analysis (CFA) was used to test the structure of the scale. Normed fit index (NFI), comparative fit index (CFI), and adjusted goodness-of-fit index (AGFI) values ≥0.90, and root mean square error of approximation (RMSEA) values < 0.08 were considered to indicate a good fit [15, 16]. Cronbach’s alpha analysis was used to measure internal consistency. Cronbach’s alpha values range between 0 and 1, and a high Cronbach’s alpha means that the items are strongly correlated [17]. CFA, NFI, CFI, AGFI, RMSEA, and Cronbach’s alpha were determined using SAS® software (SAS Institute Inc., Cary, NC, USA).
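Cronbach's alpha for each subscale can be reproduced directly from the item-score matrix. This is a sketch of the standard formula; the authors used SAS, so the implementation below is ours, not theirs:

```python
import numpy as np

def cronbach_alpha(item_matrix):
    """Cronbach's alpha for an (n_respondents x k_items) array of scores."""
    X = np.asarray(item_matrix, dtype=float)
    k = X.shape[1]
    item_variances = X.var(axis=0, ddof=1)      # per-item sample variance
    total_variance = X.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Sanity check: perfectly correlated items yield alpha = 1
X = np.array([[1, 1], [2, 2], [3, 3], [4, 4]])
alpha = cronbach_alpha(X)
print(round(alpha, 3))  # 1.0
```

High alpha values such as the 0.869-0.950 reported here indicate that the items of each subscale are strongly intercorrelated.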

The degree of within-cluster dependence was quantified with intraclass correlation coefficients (ICCs), with confidence intervals calculated using a formula derived by Donner and Klar based on analysis of variance [18]. Team- and hospital-level ICCs were calculated to determine to what extent scores at these levels correlate with each other. The Kruskal-Wallis test was used to compare CPSET scores across more than two independent groups; we used it to compare the overall CPSET and its five subscales across profession and age categories. The Mann–Whitney test, which assesses differences between two independent groups, was used to test differences by gender and between pairs of profession and age categories. Kruskal-Wallis and Mann–Whitney tests were performed with StatsDirect 2.7.8 (StatsDirect Ltd, Altrincham, UK) and SPSS version 19.0 (SPSS Inc., Chicago, IL, USA). The level of significance was set at p < 0.05.
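The within-cluster dependence can be sketched as a one-way ANOVA intraclass correlation. The adjusted average cluster size n0 below is the usual ANOVA estimator for unequal cluster sizes; whether it matches Donner and Klar's formula in every detail is our assumption:

```python
import numpy as np

def icc_oneway(groups):
    """One-way ANOVA intraclass correlation for a list of per-team score arrays."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    sizes = np.array([len(g) for g in groups], dtype=float)
    n_total, m = sizes.sum(), len(groups)
    grand_mean = np.concatenate(groups).mean()
    ss_between = sum(n * (g.mean() - grand_mean) ** 2 for n, g in zip(sizes, groups))
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_between = ss_between / (m - 1)
    ms_within = ss_within / (n_total - m)
    n0 = (n_total - (sizes ** 2).sum() / n_total) / (m - 1)  # adjusted cluster size
    return (ms_between - ms_within) / (ms_between + (n0 - 1) * ms_within)

# Identical scores within each team but different across teams -> ICC = 1
icc = icc_oneway([[80.0, 80.0], [60.0, 60.0], [70.0, 70.0]])
print(icc)  # 1.0
```

Low ICCs, as found in this study, mean that most of the score variation lies within clusters rather than between them.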

Results

Respondents

This study included participants from 92 organizations in Belgium and 22 in the Netherlands. These organizations can be classified as acute hospitals (n = 88), psychiatric hospitals (n = 2), specialized hospitals (n = 9), and primary care organizations (n = 15). Overall, 48.3% of the participants were 40 years or older, 60.31% were female, and more than half (54.76%) were nurses. In total, 283 teams participated, with between 1 and 28 teams per organization. Some teams included a coordinator, who could be either a team leader or a care process coordinator. Teams that included other staff, such as care logistics personnel, were classified as ‘others’ (Table 1).

Table 1 Demographic characteristics

Confirmatory factor analysis

CFA was performed on the original five-factor solution with 29 items [9]. The data of the 3139 participants were used. The four fit indices revealed a good fit for the CPSET (AGFI = 0.87; RMSEA = 0.06; CFI = 0.94; NFI = 0.93).

Internal consistency

Internal consistency was measured by calculating Cronbach’s alpha reliability coefficients for each of the five factors. The Cronbach’s alphas for the five factors or subscales were between 0.869 and 0.950 (patient-focused organization, alpha = 0.919; coordination of care, alpha = 0.900; communication with patient and family, alpha = 0.897; collaboration with primary care, alpha = 0.869; follow-up of care, alpha = 0.950).

Intraclass correlation coefficients

We calculated ICCs at team (n = 283) and hospital levels (n = 114). ICCs for each of the five dimensions are shown in Table 2. A team was defined as more than three health care providers who work together around a care process. At the team level, subscale ICCs ranged from 0.127 (collaboration with primary care) to 0.232 (coordination of care). At the hospital level, subscale ICCs ranged from 0.071 (collaboration with primary care) to 0.151 (communication with patient and family). The ICC for the overall CPSET score at the team level was 0.221, whereas the ICC at the hospital level was 0.147. These results showed that there was poor agreement at team and hospital levels and that there was less variation within teams than within hospitals.

Table 2 ICCs and 95% confidence intervals

CPSET subscales

Gender, age, and profession of team members had a significant impact on the five subscales and the overall CPSET score. Men scored significantly higher on the subscales ‘communication with patient and family’ and ‘collaboration with primary care,’ and on the overall CPSET. Team members younger than 30 years old scored lower on the subscales ‘patient-focused organization,’ ‘communication with patient and family,’ and ‘collaboration with primary care’ and on the overall CPSET. Medical doctors scored significantly higher on the subscales ‘patient-focused organization,’ ‘coordination of care,’ ‘communication with patient and family,’ and ‘collaboration with primary care’ and on the overall CPSET. Paramedics scored significantly higher on the subscale ‘follow-up of care.’ These results are summarized in Table 3.

Table 3 Differences in team member demographics according to CPSET subscales
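The group comparisons reported above can be reproduced with scipy's implementations of the same tests. The data here are simulated, with group means chosen purely for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated overall CPSET scores (%); group sizes and means are assumptions
doctors = rng.normal(75, 8, 120)
nurses = rng.normal(70, 8, 400)
paramedics = rng.normal(71, 8, 90)

# More than two independent groups (e.g. professions): Kruskal-Wallis
h_stat, p_kw = stats.kruskal(doctors, nurses, paramedics)

# Exactly two independent groups (e.g. male vs. female): Mann-Whitney U
u_stat, p_mw = stats.mannwhitneyu(doctors, nurses, alternative="two-sided")

print(f"Kruskal-Wallis p = {p_kw:.3g}, Mann-Whitney p = {p_mw:.3g}")
```

Both are rank-based tests, so they fit the ordered categorical CPSET scores without assuming normality.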

CPSET cutoff scores

On the basis of the individual scores of the 3139 participating health care providers, we calculated cutoff scores for the five subscales and the overall CPSET (Table 4). Because the subscale scores were approximately normally distributed, percentiles were used for the cutoff table. The subscales ‘communication with patient and family’ and ‘follow-up of care’ had a broader range of scores than the other subscales and the overall CPSET.

Table 4 Cutoff table
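A cutoff table of this kind is a set of percentiles of the individual scores. A sketch on simulated data (the published table is based on the 3139 respondents' actual scores, which we do not have):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated, roughly normal overall CPSET scores (%) -- illustration only
scores = np.clip(rng.normal(loc=70, scale=10, size=3139), 0, 100)

percentiles = (10, 25, 50, 75, 90)
cutoffs = {f"P{p}": float(np.percentile(scores, p)) for p in percentiles}
for name, value in cutoffs.items():
    print(f"{name}: {value:.1f}")
```

A team comparing itself against such a table simply locates its own subscale score between two adjacent percentile cutoffs.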

Discussion

This paper describes the psychometric properties of the CPSET and defines CPSET cutoff scores. The psychometric properties of the CPSET showed that this tool is valid and reliable for evaluating the organization of a care process as perceived by team members. The original five-factor structure with 29 items was confirmed by CFA. The reliability of the CPSET was measured by Cronbach’s alpha. The Cronbach’s alpha results in this study varied between 0.869 and 0.950 and were higher than those reported in the original 2007 study of Vanhaecht et al. (Cronbach’s alpha = 0.776-0.928) [9]. This indicates that the scale is still reliable.

A multilevel analysis was performed at team and hospital levels. Our results showed that the ICCs of scores at the team level were higher than those at the hospital level. This means that there was less variance in CPSET scores within teams than within hospitals, which was expected. The ICCs in our study were low, indicating that only a small proportion of the variation in scores could be attributed to team or hospital membership. One possible reason for the low ICCs is that teams were composed of professionals from different disciplines, with each team member having different perceptions about the organization of care. As shown in Table 3, profession, age, and gender significantly influenced CPSET scores. Medical doctors scored significantly higher than other health professionals on the overall CPSET scale and on the following subscales: ‘patient-focused organization,’ ‘coordination of care,’ ‘communication with patient and family,’ and ‘collaboration with primary care.’ Paramedics scored significantly higher on the subscale ‘follow-up of care.’ Men scored significantly higher on the overall CPSET scale and the subscales ‘communication with patient and family’ and ‘collaboration with primary care.’ Team members between 20 and 39 years old scored significantly lower on the subscale ‘communication with patient and family’ than those in other age categories. Significant differences were found between younger (< 40 years) and older (> 50 years) health care professionals on the subscale ‘coordination of care.’ Coordinators of care pathways scored lower on the subscales ‘coordination of care’ and ‘follow-up of care,’ perhaps because they tend to be more critical of the organization of care.

The differences we observed in the perceptions of medical doctors and nurses are consistent with those observed in previous research. In the present study, physicians rated teamwork, collaboration, and communication more positively than nurses did, which is consistent with the finding that physicians generally perceive teamwork to be better coordinated [19–21]. A negative correlation exists between professional autonomy and the level of nurse-physician collaboration [22]. Different perspectives in communication can be caused by hierarchical factors, gender, different patient care responsibilities, different perceptions of requisite communication standards, and differences in training methods for nurses and doctors [19]. Communication skills training can improve patient-nurse communication but not patient-doctor communication. Training that incorporates patient-centered communication can increase information exchange and continuity of care for patients [23].

The lowest CPSET scores were observed on the subscales ‘communication with patient and family’ and ‘follow-up of care.’ Organizations need to prioritize improvements in communication and coordination of care, as suggested by Bates et al. [24]. Relational coordination can be used to improve ‘follow-up of care’: this framework can lead to better quality of care, and health care providers using it reported fewer adverse events [25]. Effective and safe hospital care depends on good teamwork. Better teamwork is associated with higher patient satisfaction, higher nurse retention, and lower hospital costs [26]. Multidisciplinary teamwork is essential for quality health care.

The 2012 review of Deneckere et al. showed that multidisciplinary teamwork can be supported by using care pathways [7]. Care pathways can improve the work environment and organization of care, and can have a positive impact on the well-being of health care providers [7, 27, 28]. Further research is needed on using the CPSET to study the effect of coordinating mechanisms, such as care pathways, on how health care providers perceive the organization of care.

Although the CPSET has been used for several years, health care managers have problems interpreting the CPSET scores of their team members. Therefore, we compiled a table of cutoff scores that will permit health care managers and team members to compare how they perceive the organization of their care process relative to other team members. By using the cutoff table as a starting point, team members can discuss how they can improve the organization of care. When different teams of the same care facility or institution complete the CPSET, the cutoff scores will help health care managers rank the teams in that facility or institution. However, this should be done carefully. The primary aim of the cutoff scores is to initiate discussions within teams. For example, teams can look for possible reasons why they perceive the organization of care to be poor and clarify what they expect from other team members. They can also learn from actions taken by other teams. We hypothesize that teams that use care bundles, care pathways, or evidence-based protocols will have CPSET scores in the higher percentiles compared with teams that do not use quality improvement initiatives. However, further research is needed to determine whether the structured care associated with quality improvement initiatives does indeed change health care providers’ perception of the actual organization of care.

Some limitations of our study should be considered. One limitation is the risk of social desirability and selection bias: team leaders and coordinators of care processes decided which specific team members would complete the CPSET, so it is possible that not all team members or health care professionals involved in a specific care process were surveyed. Another concern is that the results of this study are based on data from only two countries, Belgium and the Netherlands. Therefore, a comparable study should be conducted in additional countries. The validity of the CPSET is currently being tested in French, Norwegian, Italian, Portuguese, English, and German.

Conclusions

The CPSET is a valid and reliable tool for measuring the organization of care as perceived by the health care providers involved. Some of the CPSET scores depend on age, gender, and profession. Team leaders can use the CPSET to evaluate how their team members perceive the organization of care. The cutoff scores presented in this study will help health care managers rank their teams, identify differences between teams within a care facility or institution, and analyze the needs of teams in their collaborative search for excellence.

References

  1. Altman DE, Clancy C, Blendon RJ: Improving patient safety – five years after the IOM report. N Engl J Med. 2004, 351: 2041-2043. 10.1056/NEJMp048243.


  2. Classen DC, Resar R, Griffin F, Federico F, Frankel T, Kimmel N, Whittington JC, Frankel A, Seger A, James BC: ‘Global Trigger Tool’ shows that adverse events in hospitals may be ten times greater than previously measured. Health Aff. 2011, 30: 581-589. 10.1377/hlthaff.2011.0190.


  3. Landrigan CP, Parry GJ, Bones CB, Hackbarth AD, Goldmann DA, Sharek PJ: Temporal trends in rates of patient harm resulting from medical care. N Engl J Med. 2010, 363: 2124-2134. 10.1056/NEJMsa1004404.


  4. Committee on Quality of Health Care in America, Institute of Medicine: Crossing the quality chasm: A new health system for the 21st century. 2001, Washington DC: National Academy Press


  5. Porter ME, Teisberg EO: How physicians can change the future of health care. JAMA. 2007, 297: 1103-1111. 10.1001/jama.297.10.1103.


  6. Chilingerian JA, Glavin MP: Temporary firms in community hospitals: elements of a managerial theory of clinical efficiency. Med Care Rev. 1994, 51: 289-335. 10.1177/107755879405100303.


  7. Deneckere S, Euwema M, Van Herck P, Lodewijckx C, Panella M, Sermeus W, Vanhaecht K: Care Pathways Lead to Better Teamwork: Results of a Systematic Review. Soc Sci Med. 2012, 75: 264-268. 10.1016/j.socscimed.2012.02.060.


  8. Manser T: Teamwork and patient safety in dynamic domains of healthcare: a review of the literature. Acta Anaesthesiol Scand. 2009, 53: 143-151. 10.1111/j.1399-6576.2008.01717.x.


  9. Vanhaecht K, De Witte K, Depreitere R, van Zelm R, De Bleser L, Proost K, Sermeus W: Development and validation of a care process self-evaluation tool. Health Serv Manage Res. 2007, 20: 189-202. 10.1258/095148407781395964.


  10. Deneckere S, Robyns N, Vanhaecht K, Euwema M, Panella M, Lodewijckx C, Leigheb F, Sermeus W: Indicators for follow-up of multidisciplinary teamwork in care processes: results of an international expert panel. Eval Health Prof. 2011, 34: 258-277. 10.1177/0163278710393736.


  11. Vanhaecht K, Van Gerven E, Segal O, Panella M, Sermeus W, Bellemans J, Simon JP: Is variation in the content of care pathways leading to quality and patient safety problems?. Hip Int. 2011, 21: 770-771. 10.5301/HIP.2011.8844.


  12. Segal O, Bellemans J, Van Gerven E, Deneckere S, Panella M, Sermeus W, Vanhaecht K: Important variations in the content of care pathway documents for total knee arthroplasty may lead to quality and patient safety problems. J Eval Clin Practice. 2013, 19: 11-15. 10.1111/j.1365-2753.2011.01760.x.


  13. Van Gerven E, Vanhaecht K, Deneckere S, Vleugels A, Sermeus W: Management challenges in care pathways: conclusions of a qualitative study within 57 health care organizations. Int J Care Path. 2010, 14: 142-149. 10.1258/jicp.2010.010029.


  14. Deneckere S, Euwema M, Lodewijckx C, Panella M, Sermeus W, Vanhaecht K: The European Quality of Care Pathways (EQCP) study on the impact of care pathways on interprofessional teamwork in an acute hospital setting: study protocol for a cluster randomised controlled trial and evaluation of implementation processes. Implement Sci. 2012, 7: 47. 10.1186/1748-5908-7-47.


  15. Garson DC: Structural Equation Modeling. NC State University, [http://www2.chass.ncsu.edu/garson/pa765/structur.htm]

  16. Medsker GJ, Williams LJ, Holahan PJ: A review of current practices for evaluating causal models in organizational behavior and human resources management research. J Manage. 1994, 20: 439-464.


  17. Waltz CF, Strickland OL, Lenz ER: Measurement in nursing and health research. 2005, New York: Springer, (3rd edition)


  18. Donner A, Klar N: Design and analysis of cluster randomization trials in health research. 2000, London: Arnold


  19. Thomas EJ, Sexton JB, Helmreich RL: Discrepant attitudes about teamwork among critical care nurses and physicians. Crit Care Med. 2003, 31: 956-959. 10.1097/01.CCM.0000056183.89175.76.


  20. Wauben LS, Dekker-Van Doorn CM, van Wijngaarden JD, Goossens RH, Huijsman R, Klein J, Lange JF: Discrepant perceptions of communication, teamwork and situation awareness among surgical team members. Int J Qual Health Care. 2011, 23: 159-166. 10.1093/intqhc/mzq079.


  21. Carney BT, West P, Neily JB, Mills PD, Bagian JP: Improving perceptions of teamwork climate with the Veterans Health Administration medical team training program. Am J Med Qual. 2011, 26: 480-484. 10.1177/1062860611401653.


  22. Papathanassoglou ED, Karanikola MN, Kalafati M, Giannakopoulou M, Lemonidou C, Albarran JW: Professional autonomy, collaboration with physicians and moral distress among European intensive care nurses. Am J Crit Care. 2012, 21: 41-52. 10.4037/ajcc2012205.


  23. Nørgaard B, Kofoed PE, Ohm Kyvik K, Ammentorp J: Communication skills training for health care professionals improves the adult orthopaedic patient’s experience of quality of care. Scand J Caring Sci. 2012, 26: 698-704. 10.1111/j.1471-6712.2012.00982.x.


  24. Bates DW, Larizgoitia I, Prasopa-Plaizier N, Jha AK, Research Priority Setting Working Group of the WHO World Alliance for Patient Safety: Global priorities for patient safety research. BMJ. 2009, 338: b1775-10.1136/bmj.b1775.


  25. Havens DS, Vasey J, Gittel JH, Lin WT: Relational coordination among nurses and other providers: impact on the quality of care. J Nurs Manag. 2010, 18: 926-937. 10.1111/j.1365-2834.2010.01138.x.


  26. O’Leary KJ, Sehgal NL, Terrell G, Williams MV, For the High Performance Teams and the Hospital of the Future Project Team: Interdisciplinary teamwork in hospitals: a review and practical recommendations for improvement. J Hosp Med. 2012, 7: 48-54. 10.1002/jhm.970.


  27. Aiken LH, Sermeus W, Van den Heede K, Sloane DM, Busse R, McKee M, Bruyneel L, Rafferty AM, Griffiths P, Moreno-Casbas MT, Tishelman C, Scott A, Brzostek T, Kinnunen J, Schwendimann R, Heinen M, Zikos D, Sjetne IS, Smith HL, Kutney-Lee A: Patient safety, satisfaction, and quality of hospital care: cross sectional surveys of nurses and patients in 12 countries in Europe and the United States. BMJ. 2012, 344: e1717-10.1136/bmj.e1717.


  28. Verhaeghe R, Vlerick P, De Backer G, Van Maele G, Gemmel P: Recurrent changes in the work environment, job resources and distress among nurses: a comparative cross-sectional survey. Int J Nurs Stud. 2008, 45: 382-392. 10.1016/j.ijnurstu.2006.10.003.


Author information

Correspondence to Kris Vanhaecht.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

KV, WS, and MP defined the design of the study and organized the data collection. SD, EVG, and DS coordinated the data cleaning and supported the participating organizations. LB, TM, RCB, and SK performed the statistical analysis. KV, SD, and DS prepared the first draft of the manuscript. All authors discussed the results and approved the final version of the manuscript.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. and distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Cite this article

Seys, D., Deneckere, S., Sermeus, W. et al. The Care Process Self-Evaluation Tool: a valid and reliable instrument for measuring care process organization of health care teams. BMC Health Serv Res 13, 325 (2013). https://doi.org/10.1186/1472-6963-13-325
