The treatment of Head and Neck Cancer (HNC) is an example of low volume, highly complex, multidisciplinary integrated care. To monitor and effectively improve this integrated care, the Dutch Head and Neck Audit (DHNA) was set up in 2014, with quality indicators as a basis for monitoring, benchmarking and identifying areas for improvement. This paper gives an overview of its development, first results, and implications.
Quality Indicators (QIs) were developed from three perspectives: Medical specialists, allied health professionals and patients. Data were collected in an online registration system.
Setting up a multidisciplinary quality registration is challenging and time-consuming. Involvement of all health professionals and development of good QIs are crucial. Efforts should be made at a national level to solve privacy and legal restrictions for quality registrations. It is also crucial to decrease the registration burden, for example with a reliable automatic data extraction system. Although the registration was only recently launched, it already visualizes hospital variation in current care. More data are needed to better define case-mix, obtain more insight into long-term Patient Reported Outcomes (PROs) and Patients' Experiences (PREs), and to define the relation between PROs and PREs and patient outcomes such as survival.
The development of a multidisciplinary quality registration from different perspectives is feasible. The experiences obtained in this project can be used to set up other oncological quality registrations. In the upcoming years, more data have to be obtained, enabling more reliable feedback to improve the quality of health care for patients with HNC in the Netherlands.
Head and neck, Quality of care, Quality indicator, Patient reported outcomes, Patient reported experiences
HNC: Head and Neck Cancer; QI: Quality Indicator; PRO: Patient Reported Outcome; PRE: Patient Reported Experience; DHNA: Dutch Head and Neck Audit; HANA: Head and Neck Audit (UK); DAHANCA: Danish Head and Neck Cancer Database; DHNS: Dutch Head Neck Society; RIVM: National Institute for Public Health and Environment; ICHOM: International Consortium for Health Outcomes Measurement
The treatment of Head and Neck Cancer (HNC) is an excellent example of low volume, highly complex, multidisciplinary integrated care. HNCs are heterogeneous (both biologically and in clinical behavior) fast-growing tumors in an anatomically and functionally complex area, with multiple invasive treatment opportunities. Several medical specialists and allied health professionals are involved in delivering high quality care to individual patients. To increase the quality of care, coordination is crucial, resulting in less fragmentation and unnecessary replication [1].
To monitor and effectively improve high quality integrated care, a clinical audit defined as "A quality improvement process that seeks to improve patient care and outcomes through systematic review of care against explicit criteria and the implementation of change" can be helpful [2]. The Dutch Head and Neck Audit (DHNA) was set up in 2014 to monitor the quality of integrated HNC care with evidence-based quality indicators as a basis, for benchmarking and finding areas for improvement. Quality indicators were developed from three different perspectives: Medical specialists, allied health professionals and patients. Following one year of inventory and building an online quality registration system [3,4], the first data were collected to fill the indicators in 2015-2016 [5]. The preferences of health professionals and patients in receiving feedback on results were investigated as well [6]. In this paper, the main findings of the project from the perspective of the current status of HNC care in the Netherlands and the implications for clinical practice, future research and policymaking will be described. An example of setting up a similar quality registration was given in a previous published paper [7] (Box 1).
The DHNA is the first quality of health care registration system in the Netherlands involving both medical specialists and allied health professionals, with all indicators agreed upon by patients. In other words, the DHNA is a truly multidisciplinary registration. Most quality registrations are currently monodisciplinary or only involve process indicators from the medical specialists' perspective [8]. A monodisciplinary audit focuses on process performance and patient outcomes from the perspective of one discipline with the aim of improving quality of care. However, multidisciplinary care is nowadays more common, as several disciplines contribute to patient outcomes. A good example of this is the 'swallow function' after a curative treatment for an HNC: This can be influenced by both medical treatment and supportive care of the speech therapist or dietician [9]. Quality is often a result of both.
The DHNA is also one of the first HNC quality registrations on an international level. Other countries already have databases with the aim of improving quality of care and patient outcomes, for example, the Head and Neck Audit (HANA) [10] in the UK, or the Danish Head and Neck Cancer Database (DAHANCA) [11]. These databases, however, were built from an epidemiological perspective for clinical trials and did not use defined evidence-based quality indicators from the start. An epidemiological database is not primarily intended for quality registration from the perspective of process indicators, Patient Reported Outcomes (PROs) and Patients' Experiences (PREs) [12].
The basic assumption for the quality indicator development procedure used for the DHNA was that outcome indicators formed the basis of process and structural indicators. In addition, the indicators were developed from three different perspectives (Figure 1).
The DHNA outcome indicators followed the three-tiered hierarchy for value-based healthcare, as developed by Porter [13]. The first tier, e.g. survival, is generally the most important, and lower-tier outcomes, e.g. sustainability of health, follow from the success of the higher tiers. We therefore followed the current trend to focus on outcomes [14], such as disease-specific mortality and survival or PROs.
There are three reasons not to focus solely on outcome indicators. Firstly, process indicators are more sensitive in measuring differences in quality of care [15]. Secondly, a process indicator is easier to interpret, whereas an outcome indicator, for example mortality, is a rather more indirect measure [15]. Thirdly, by only measuring outcome performance, there is no information on how to begin addressing problems. When a hospital discovers poor performance for one particular outcome, the first step is to dissect the outcome into its different components, and to ensure adherence to all best practice recommendations at process level [14]. For these reasons, outcome indicators alone seem to be a less suitable measure of quality of care [16], and some researchers push the pendulum back towards process measures [14].
Furthermore, the link between process and outcome indicators is often unclear. This is mainly because many data and sufficient follow-up years are necessary to analyze this association. With the DHNA, the link between process and outcome indicators can be analyzed when more data become available in the future.
In the DHNA, the PROs and PREs are requested via online patient questionnaires; the remaining data are recorded by healthcare providers themselves in an online registration system. All collected data are stored in a database. To ensure that data will be analyzed according to current rights and privacy regulations, it was necessary to set up contracts between the HNC centers and the data processors. However, hospitals and their HNC centers appeared to have their own interpretation of legal regulations regarding aspects such as exchange of encrypted data and ownership. Unfortunately, the lawyers could not reach consensus. This underlines the need for uniform regulation of privacy aspects.
The final set of indicators consisted of five outcome indicators (survival, recurrence, complications, PROs and PREs), 13 and 18 process indicators from the medical specialist perspective and the allied health professional perspective respectively, and three structure indicators from the allied health professional perspective. From the patients' perspective, a total of 34 relevant themes of needs and preferences were identified to obtain tools to make current integrated HNC care more patient-centred [4]. The results and usefulness of three indicators will be discussed in this paragraph, namely: 1) Survival: since this is one of the most important outcomes relevant for both patients and professionals; 2) Time to treatment interval: since patients noted that there is an urgent need to reduce waiting times in the hospital; and 3) PROs: since these are increasingly used to measure quality of care and provide us with information about how the patient feels.
Previous (European) studies showed that the survival of HNC patients in the Netherlands is relatively high [17,18]. Compared to other countries in Europe, the Netherlands is one of the best performers on survival. In a EUROCARE-5 population-based study of head and neck cancers diagnosed in the early 2000s, for example, five-year survival for patients with laryngeal cancer was 68.9% in the Netherlands compared to 58.9% in Europe. For patients with oral cancer the difference was similar, namely 56.1% in the Netherlands compared to 45.4% in Europe [18]. This shows that HNC care in the Netherlands performs relatively well with regard to survival, which could be indicative of quality of care. This might be due to the concentration and centralization of HNC care since 1984 under the umbrella of the Dutch Head Neck Society (DHNS) [19]. Monitoring the quality of integrated HNC care using the DHNA provides opportunities to further explore the association between survival and quality of care.
In the Netherlands, all professional associations related to HNC care agreed that 80% of all new patients should receive their primary treatment within 30 calendar days from the first consultation at an HNC center. However, nationwide, only 48% of the patients start their treatment within 30 calendar days, varying from 20 to 72% between HNC centers [5]. A previous study in the Netherlands (2007) showed an average 'time to treatment interval' of 28 days, with a variation of 5-95 days between diagnostic and radiotherapy planning scans [20]. Thus, results have hardly changed in ten years, and further improvement is still possible. By visualizing the 'time to treatment interval' using the DHNA, and providing active feedback to health professionals, this can be improved in the future.
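The adherence score for such a time-to-treatment indicator is a simple proportion. As a minimal sketch (the field names and dates are hypothetical, not taken from the DHNA data model), the calculation could look like this:

```python
from datetime import date

def time_to_treatment_indicator(patients, norm_days=30):
    """Percentage of patients starting primary treatment within
    `norm_days` calendar days of the first consultation at an HNC center."""
    intervals = [
        (p["treatment_start"] - p["first_consult"]).days for p in patients
    ]
    within_norm = sum(1 for d in intervals if d <= norm_days)
    return 100.0 * within_norm / len(intervals)

# Hypothetical example: one patient treated within the 30-day norm, one not
patients = [
    {"first_consult": date(2016, 3, 1), "treatment_start": date(2016, 3, 24)},
    {"first_consult": date(2016, 3, 7), "treatment_start": date(2016, 4, 20)},
]
score = time_to_treatment_indicator(patients)  # 50.0
```

A center's score can then be compared against the nationally agreed norm of 80%.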
The Netherlands is one of the countries that seem most advanced in implementing PROs, and it also appears to be leading in including PROs in national registries [21]. Internationally, there is a policy shift towards value-based health care and health outcome evaluations, such as in the UK, US, Sweden and the Netherlands. The first DHNA results concerning the PROs showed that function and symptoms differ between type of treatment, follow-up moments, age and tumor staging [6]. Results and methods used were comparable to other studies [22,23]. Many studies focus on differences in patient-reported outcomes and on what can be done by hospitals and health professionals to increase the effectiveness of using PROs [24]. However, a crucial step in value-based health care is the effect of measuring PROs on patient outcomes such as survival. A good example of this is given by Basch, et al., who recently showed that survival increased significantly in patients who monitored symptoms with PROs compared to patients who received standard care [25]. So far, this has received little study and will be one of the aims of the DHNA for the future.
In general, an audit registration such as the DHNA needs a couple of years of data to provide stable results [26]. As the first data were collected in December 2014, it is too early yet to present results on all indicators. Preliminary results of the DHNA show that, even in a recently launched quality registration, with 2,400 new HNC patients included, variations in the delivery of current processes of care among HNC centers are already visible. Feedback on indicators in the DHNA is given via an automatic online dashboard, which is only accessible by staff at individual HNC centers, who are able to view the scores of other HNC centers anonymously along with the average score [27]. This system allows health professionals to easily compare the performance of their own HNC center with a nationwide benchmark, and to act on their own results. As mentioned above, variation was shown for the time to treatment interval from first consult to start of treatment. This can be one of the first starting points to share best practices between hospitals towards decreasing time to treatment. Besides transparency within and between hospitals, the first results of the DHNA can also be shown to the public in the upcoming years, in other words: Public transparency.
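The anonymized benchmark view described above can be sketched as follows. This is a minimal illustration under assumed data structures (the center names and scores are hypothetical, and the DHNA dashboard's actual implementation is not described in this paper): each center sees its own score, the national average, and other centers' scores stripped of identity.

```python
def benchmark_view(scores, own_center):
    """Build the feedback a single center sees: its own indicator score,
    the national average, and all other centers' scores anonymized."""
    national_avg = sum(scores.values()) / len(scores)
    others = sorted(v for k, v in scores.items() if k != own_center)
    # Relabel other centers as 'Center A', 'Center B', ... to hide identity
    anonymized = {f"Center {chr(65 + i)}": v for i, v in enumerate(others)}
    return {
        "own": scores[own_center],
        "national_average": round(national_avg, 1),
        "others": anonymized,
    }

# Hypothetical indicator scores (% adherence) for three centers
scores = {"Center-X": 62.0, "Center-Y": 48.0, "Center-Z": 34.0}
view = benchmark_view(scores, "Center-X")
```

Here `view` contains the center's own score (62.0), the national average (48.0), and the two other scores under anonymous labels, which is the comparison the dashboard enables.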
For most outcome indicators, such as recurrence rates and survival, it takes several years before stable and reliable data are complete enough to be interpreted. However, some results of the DHNA are already visible for patients, namely PROs and PREs. Patients can directly see how their results differ from the last time. In addition, they can bring the results along to the medical consult and discuss their concerns or ask for possible solutions. In the future, the health professional can also check for outliers or relevant differences compared to a previous consult in their own electronic system, prior to the consult. Together with the patient, they can start acting on the results at an earlier stage, thus improving rehabilitation. Therefore, an automatic feedback loop toward the patient and the health professional, and introducing the relevance of PROs in a medical consult, can improve quality of care in small steps and earlier on, compared to quality improvements depending on aspects such as recurrence rates and survival.
Although positive effects of audit and feedback in general have been reported, e.g. decreased duration of hospital stay [28] and decreased mortality rates [29], this method of improving quality of care has not been found to be consistently effective [30-33]. Previous research shows that the format of feedback may significantly affect the interpretation of data [34-36]. The DHNA showed that tailored reports of feedback on professional practice and healthcare outcomes are recommended, since feedback preferences differ between medical specialists, allied health professionals, and health insurers [27]. In general, the preferences for receiving feedback differ regarding content but not regarding layout. This knowledge gives us tools to improve the effects of audit and feedback by adapting the feedback format and contents to the preferences of stakeholders.
Developing evidence-based indicators for the DHNA from the perspective of allied health professionals proved to be a challenge, as there are hardly any (inter)national guidelines that provide evidence-based recommendations for daily healthcare delivery [3]. The indicators were developed in collaboration with the Dutch national foundation for allied health professionals in this specialist sector, the Paramedische Werkgroep Hoofd Halstumoren (PWHHT). Panel members were instructed to discuss the potential indicators with the allied health professionals of their own discipline, in their own center, and in other Dutch HNC centers as well. For some disciplines, variation in delivery of care between the different centers became visible. Therefore, the development of indicators was more of a starting point for debate about how HNC care should be delivered. As a consequence, developing indicators from the allied health perspective took more time to reach agreement than the medical indicators. Discussion persisted over the indicators developed from the perspective of speech therapists; therefore, new indicators were developed after one year. Overall, evidence-based guidelines or literature are important requirements for developing evidence-based indicators. However, evidence-based guidelines are not always available for rare diseases. A well-performed consensus procedure is then necessary to develop useful indicators.
When interpreting quality indicator scores, it may be difficult to distinguish between a lack of documentation and actual insufficient adherence to guidelines. For example, if the indicator 'Presence of a case manager or nurse practitioner at the consultation to discuss the treatment plan' does not have a positive score, it could mean that the case manager was not present, or that this was not documented as such. In addition, to reliably benchmark the performance of one hospital against average national performance, it is crucial that all hospitals include all their patients. Otherwise, with only a proportion of patients, it is impossible to calculate a stable indicator, as 1) it is unknown which patients are missing, and 2) variations in patient numbers can influence the adherence percentage. If this happens for outcome indicators, it might set both the hospital and the national performance at a disadvantage. Therefore, during the first year most registrations merely focus on developing indicators and the quality registration; the second year on ensuring that all data will be collected; while in the third year the first results are anonymously presented.
Several lessons were learned from this project; they are explained in this paragraph.
Setting up a multidisciplinary quality registration is quite challenging and time-consuming. For a successful quality registration, it is important that it is set up for and by health professionals [26]. The DHNA is an example of a quality registration in which the health insurer was involved as a partner, alongside the health professionals and the patients, when developing the quality registration. This is also called 'tripartite'.
The website of the National Institute for Public Health and Environment, the 'Rijksinstituut voor Volksgezondheid en Milieu' (RIVM), in the Netherlands states that there are currently 181 active quality registrations, while annual costs for quality registrations are estimated at 80 million Euros [37,38]. These costs are currently spent on registrations and not directly on the patient. One way to decrease the registration burden and the associated costs is to reduce the number of registrations and to make the quality registrations as comprehensive as possible. Another method to decrease the registration burden is to automate data extraction from hospital electronic patient records. Building a reliable automatic extraction system is, of course, initially expensive, but not in the long run. Furthermore, Govaerts, et al. showed that improved outcomes due to auditing can also reduce costs [39].
Besides public transparency, the key to achieving improvement collaboratively is to share results within a hospital or between hospitals. Therefore, it is crucial to present the results in such a way that they support collaborative improvement, but also offer a safe platform to share results. Moreover, the method of communication about this kind of non-public transparency is important and should not be neglected.
The data from the DHNA provide the first opportunity to visualize differences in outcomes and practice performance at a national level. With this information, best practices can be identified, and ultimately, data can be used to improve quality of care. From the first results of the DHNA, we know that variation between centers is present, and that four patient and hospital determinants influenced the indicator scores [5]. When more data are available, research can be directed towards all indicators to explore the variance and possible patient and hospital determinants.
The DHNA shows that patients with multimodality treatments experience a less well-organized healthcare process, and suffer from lower functional scores and more symptoms, compared to patients with a mono-modality treatment [6]. In the future, more data should be collected to obtain more insight into long-term quality of life and patients' experiences. Apart from carrying out research on the outcomes themselves, studies on the effect of measuring PROs on patient outcomes such as survival should be undertaken.
The first quality indicators for the DHNA were developed in the start-up phase of the quality registration. After three years, more data will be obtained, enabling more reliable feedback on a national level and a hospital level. This gives us tools to carefully evaluate the indicators.
Besides comparing quality of care between different HNC centers in the Netherlands, it would be interesting to compare the quality of HNC care to other countries in Europe [18]. An already existing consortium is the 'International Consortium for Health Outcomes Measurement' (ICHOM), which measures international patient outcomes. At this moment, no international HNC indicator set exists under the umbrella of ICHOM.
Development of good quality indicators for future registrations
It was difficult to develop evidence-based indicators from the perspective of allied health professionals in the DHNA, mainly because there were no national guidelines [3]. Campbell, et al. have previously described that evidence-based quality indicators form the foundation for a good quality registration, preferably developed by an evidence-based method [3,40,41]. However, the results from the quality registration could provide the first tools to discuss where and why HNC care is delivered differently, in order to reach consensus about best practice. Quality indicator results can thus themselves become the evidence to improve clinical practice and reframe national guidelines.
Hospitals and their HNC centers have their own interpretation of legal regulations regarding aspects such as exchange of encrypted data and ownership. Such problems are encountered on a local level, yet require a solution on a national level. Hopefully, uniform regulation on a national level will follow in the future.
The DHNA is the first quality of health care registration system in the Netherlands that involves medical specialists, allied health professionals, and patients. In addition, it is also one of the first HNC quality registrations (based on evidence-based quality indicators) on an international level. Outcome indicators formed the basis of process and structural indicators, and all indicators are evidence-based. Key elements in implementing an efficient HNC registration were keeping the health professionals involved and developing good quality indicators.
In the future, more data are needed to better explain the variation and possible patient and hospital determinants, to obtain more insight into long-term quality of life and patients' experiences, and to define the relation between PROs and PREs and patient outcomes such as survival. Hereafter, results can be shared within a hospital or between hospitals to support collaborative improvement. When hospitals give permission, data can become transparent to the public as well.
Efforts should be made on a national level to solve privacy and legal restrictions for quality registrations. In addition, the registration load should be decreased with the use of reliable automatic data extraction systems. With more data and a reduction of the registration load, the focus of the DHNA will move from registration of data to improving the quality of HNC care (Box 1).
The study sponsor, CZ health insurer, did not have any role in the study design; the collection, analysis, and interpretation of data; the writing of the article; or the decision to submit it for publication. The researchers were independent from funders and sponsors.
An overview of the development, first results, and implications for clinical practice, future research and policymaking of the Dutch Head and Neck Audit Group.