Our key finding was that hospitals differed in how they arranged the collection of performance indicator data, and in how they used (or did not use) these data for internal quality management. The level of formalisation of responsibilities and data collection processes was not aligned with the use of indicators for internal quality management activities.
Arrangements for data collection of performance indicators for external accountability
Formal arrangements were made for the tasks and responsibilities in the data collection processes. Medical specialists and nurses were responsible for registration in patient charts, although some indicated that registering all required data elements was a burden. They felt that every minute spent on administration was a minute less spent on patients. As a result, some medical specialists chose to spend their time predominantly on patients, leading to less complete patient records. Only a few hospitals made arrangements concerning registration completeness.
“For each indicator we appointed someone who is responsible for it, together with their supporting staff.” (quality manager, H3)
The formalisation of procedures was generally achieved by setting up protocols. In these protocols, tasks for data collection were specified, responsibilities were assigned to individuals, and the processes were reviewed regularly. For example, one hospital formalised these tasks and responsibilities by attaching the names of the responsible persons to specific tasks. When a task appeared to need more attention, the responsible person could then easily be identified and reminded of that task.
In other hospitals, however, the data collection processes were not formalised. This meant that the quality manager did not have an existing data set at the time of the annual indicator score submission for external accountability. Quality managers were therefore occupied in the preceding months with retrieving data from patient charts and calculating the indicator scores. In general, data came from different sources, which made it difficult to collect them for calculating indicator scores. Consequently, some hospitals decided to report 100 % compliance on several indicators without calculating the actual score.
“At patient level you can assume that a treatment is given when there is a protocol for it. So a 100 % score for that indicator is never checked.” (quality management staff, H10)
“For some of the indicators I report 100 %, because we have protocols for them.” (quality manager, H12)
“Antibiotics are always given prior to incision according to protocol. I’m absolutely certain about that! You cannot continue to the next link in the chain without checking if everything is okay. However, I have no idea what our actual performance is on this indicator.” (orthopaedic surgeon, H13)
Using performance indicators for internal quality management
Use for quality management at departmental level (oncology and orthopaedic surgery)
At one oncology department, performance indicator data were used twice a year in a meeting with all employees involved in the care for breast cancer patients, and the medical specialists held six meetings a year in which quality performance was discussed. When their performance appeared to lag behind, they tried to improve the underlying processes.
“About two years ago we reported an estimated score of 100 % for antibiotics before surgery. When we actually started measuring, it appeared to be 50 %. Since then we improved our procedures and now there is around 90 % compliance.” (quality manager, H11)
However, indicators were not always used for quality management.
“I do not think the collected data play an important role in my work, because I usually know more or less how well things are going; this is how we do things around here.” (oncology surgeon, H14)
A similar perception was observed in other hospitals. In one of the teaching hospitals, the orthopaedic surgeon indicated that one of their research assistants collected data for research projects, but that the findings were not shared with the department until the results were published in a scientific journal. The department had no performance indicator data other than these research data.
“Our quality manager collects the data once a year. In the meantime, we do not know if we are performing well according to the guidelines. Once we receive the information from the quality manager, and it appears that we perform below par, then nothing changes.” (orthopaedic surgeon, H1)
Use for quality management at hospital level
It was mentioned that indicators were used for quality management at hospital level. Generally, the quality manager drew up a report and obtained the executive board’s official approval before submitting it annually for external accountability. One quality manager indicated that this external accountability led to a change in the mindset of employees.
“After a low score on a national hospital ranking, our doctors and nurses became more aware of the importance of performance indicators, and they became more cooperative in terms of data registration and collection.” (quality manager, H4)
Some executive boards also used indicators for internal quality management; for example, performance indicators were discussed every three months in a meeting with the department heads.
Factors explaining the differences between hospitals in using performance indicators for internal quality management
Champions as linking pins
In a few cases, employees considered it their duty to collect data and to share it with their colleagues. These ‘champions’ can be considered the linking pins between data collection and its use for quality management activities, i.e. ‘linking pin champions’. One of the interviewed nurses was such a linking pin champion. She was very dedicated to collecting the data correctly, and spent much time after working hours manually copying specific data from the patient charts into a self-made Excel sheet. She then used this Excel sheet to inform the medical specialists in her department about their performance. In the interview, however, she acknowledged that this was not a sustainable situation because it relied solely on her involvement.
“If I were promoted to another function or transferred to another department, then no one would know where to find the data and no data would be given to the medical specialists.” (nurse at oncology department, H7)
In another hospital, a nurse with an IT background kept the nurses’ electronic patient record aligned with updated guidelines. For example, when an extra step was added to a guideline, the nurse added this step to the electronic patient record. Because it was an important step, the nurse made it a mandatory field, so that nurses could not forget to work according to the new guideline.
“The nurses are now working with electronic patient records, but the orthopaedic surgeons are still working with paper records. But that is something that I’m working on to change.” (orthopaedic nurse, H11)
A pro-active role of the quality manager
The interviews showed that quality managers carried out their roles in different ways. Some merely collected data for the annual reporting of performance indicators for external accountability, while others were more pro-active. One quality manager described a reactive approach.
“At the oncology department the improvements are initiated by some of the employees themselves. My role is just to collect the data.” (quality manager, H14)
Another quality manager was more pro-active and pushed the executive board’s quality agenda.
“I think it’s important that the board knows about our quality performance. Therefore I frequently make an update, print it out and put it on the CEO’s desk.” (quality manager, H13)
The role and position of quality managers determined their influence in holding professionals accountable for their quality performance, while at the same time shaping the executive board’s quality agenda. In other words, quality managers either merely pulled data from patient records for external accountability, or also pushed the quality management agenda. In hospitals where the quality manager was more pro-active, data appeared to be used more systematically for quality management activities, even when data collection arrangements were poor.
Engagement of medical specialists
The use of indicators for quality management at departmental level seemed to largely depend on the engagement of medical specialists.
“Medical specialists in our hospital really feel that they are responsible for the outcomes of the indicators.” (quality manager, H5)
“Registration in patient charts is part of care delivery.” (oncology surgeon, H8)
However, some medical specialists were sceptical about the validity of some indicators. In practice, they only used indicators that they perceived as interesting. For example, one orthopaedic surgeon indicated that the timing of administering antibiotics prior to hip or knee replacement surgery was not relevant:
“This is not a good indicator. When we score poorly on it, then I do not change anything because I do not think this indicator is important. Indicators should focus on results, such as the functionality of the patient one year after surgery.” (orthopaedic surgeon, H1)
“If they [fellow orthopaedic surgeons] do not see the link between indicators and the ‘real’ quality of care, then it is hard to convince them to register the underlying data properly.” (orthopaedic surgeon, H4)
Diversity in data infrastructures
Hospitals are free to develop their own data infrastructure; we observed 14 different types of data infrastructure in the 14 hospitals. Patient records were either paper-based, electronic, or a combination of both. Even where patient records were completely electronic, the type of software often differed between departments. In hospitals with a cohesive and homogeneous electronic data infrastructure, performance indicator scores could be calculated ‘with a click of a button’. A less robust data infrastructure, however, had consequences for the time and effort required to collect performance indicator data correctly.
“We investigated how to improve the communication between different data systems, and it will cost hundreds of thousands of euros to get it done.” (quality management staff, H2)
“We do not have electronic patient records, so it is difficult to collect data from all the different sources.” (quality management staff, H6)
“In this hospital we have one electronic patient record system. To collect the data we have to write the command in our software and then it is just a matter of ‘a click of a button’.” (quality management staff, H9)
Additionally, the more manual labour was needed to collect data from different sources, the greater the chance of making mistakes.