Journal of Head and Neck Surgery

ISSN: 2689-8713

RESEARCH ARTICLE | VOLUME 1 | ISSUE 1 | DOI: 10.36959/605/528 OPEN ACCESS

The Dutch Head and Neck Audit: The First Steps

Lydia FJ Van Overveld, Robert P. Takes, Ludi E Smeele, Matthias AW Merkx, Rosella PMG Hermens and Dutch Head and Neck Audit Group

  • Lydia FJ Van Overveld 1*
  • Robert P. Takes 2
  • Ludi E Smeele 3,4
  • Matthias AW Merkx 5
  • Rosella PMG Hermens 1
  • Dutch Head and Neck Audit Group
  • 1 Radboud University Medical Center, Radboud Institute for Health Sciences, Scientific Center for Quality of Healthcare, The Netherlands
  • 2 Department of Otolaryngology - Head and Neck Surgery, Radboud University Medical Center, Radboud Institute for Health Sciences, The Netherlands
  • 3 Department of Oral and Maxillofacial Surgery, Academisch Medisch Centrum, The Netherlands
  • 4 Department of Head and Neck Surgery and Oncology, Antoni van Leeuwenhoek Nederlands Kanker Instituut, The Netherlands
  • 5 Department of Oral and Maxillofacial Surgery, Radboud University Medical Centre, Radboud Institute for Health Sciences, The Netherlands

van Overveld LFJ, Takes RP, Smeele LE, et al. (2018) The Dutch Head and Neck Audit: The First Steps. J Head Neck Surg 1(1):1-8.

Accepted: January 25, 2018 | Published Online: January 27, 2018


Abstract


Objective

The treatment of Head and Neck Cancer (HNC) is an example of low volume, highly complex, multidisciplinary integrated care. To monitor and effectively improve high quality integrated care, the Dutch Head and Neck Audit (DHNA) was set up in 2014 (with quality indicators as a basis) to monitor, benchmark and find areas for improvement. This paper gives an overview of the development, first results, and implications.

Methods

Quality Indicators (QIs) were developed from three perspectives: Medical specialists, allied health professionals and patients. Data were collected in an online registration system.

Results

Setting up a multidisciplinary quality registration is challenging and time-consuming. Involvement of all health professionals and development of good QIs is crucial. Efforts should be made on a national level to solve privacy and juridical restrictions for quality registrations. It is crucial to decrease the registration burden, for example with a reliable automated data extraction system. Although the registration was recently launched, it already visualizes hospital variation in current care. More data are needed to better define case-mix, obtain more insight into long-term Patient Reported Outcomes (PROs) and Patients' Experiences (PREs), and to define the relation between PROs and PREs and patient outcomes such as survival.

Conclusion

The development of a multidisciplinary quality registration from different perspectives is feasible. The experience gained in this project can be used to set up other oncological quality registrations. In the upcoming years, more data have to be obtained, enabling more reliable feedback to improve the quality of health care for patients with HNC in the Netherlands.

Keywords


Head and neck, Quality of care, Quality indicator, Patient reported outcomes, Patient reported experiences

Abbreviations


HNC: Head and Neck Cancer; QI: Quality Indicator; PRO: Patient Reported Outcome; PRE: Patient Reported Experience; DHNA: Dutch Head and Neck Audit; HANA: Head and Neck Audit (UK); DAHANCA: Danish Head and Neck Cancer Database; DHNS: Dutch Head Neck Society; RIVM: National Institute for Public Health and Environment; ICHOM: International Consortium for Health Outcomes Measurement

Background


The treatment of Head and Neck Cancer (HNC) is an excellent example of low volume, highly complex, multidisciplinary integrated care. HNCs are heterogeneous (both biologically and in clinical behavior), fast-growing tumors in an anatomically and functionally complex area, with multiple invasive treatment options. Several medical specialists and allied health professionals are involved in delivering high quality care to individual patients. To increase the quality of care, coordination is crucial, resulting in less fragmentation and unnecessary duplication [1].

To monitor and effectively improve high quality integrated care, a clinical audit, defined as "A quality improvement process that seeks to improve patient care and outcomes through systematic review of care against explicit criteria and the implementation of change", can be helpful [2]. The Dutch Head and Neck Audit (DHNA) was set up in 2014 to monitor the quality of integrated HNC care, with evidence-based quality indicators as a basis, for benchmarking and finding areas for improvement. Quality indicators were developed from three different perspectives: Medical specialists, allied health professionals and patients. Following one year of inventory and building an online quality registration system [3,4], the first data were collected to fill the indicators in 2015-2016 [5]. The preferences of health professionals and patients in receiving feedback on results were investigated as well [6]. In this paper, the main findings of the project are described from the perspective of the current status of HNC care in the Netherlands and the implications for clinical practice, future research and policymaking. An example of setting up a similar quality registration was given in a previously published paper [7] (Box 1).

Characteristics of the DHNA


One of a kind

The DHNA is the first quality of health care registration system in the Netherlands involving both medical specialists and allied health professionals, with all indicators agreed upon by patients. In other words, the DHNA is a truly multidisciplinary registration. Most quality registrations are currently monodisciplinary or only involve process indicators from the medical specialists' perspective [8]. A monodisciplinary audit focuses on process performance and patient outcomes from the perspective of one discipline with the aim of improving quality of care. However, multidisciplinary care is nowadays more common, as several disciplines contribute to patient outcomes. A good example of this is the 'swallow function' after a curative treatment for an HNC: This can be influenced by both medical treatment and supportive care of the speech therapist or dietician [9]. Quality is often a result of both.

The DHNA is also one of the first HNC quality registrations on an international level. Other countries already have databases with the aim of improving quality of care and patient outcomes, for example, the Head and Neck Audit (HANA) [10] in the UK, or the Danish Head and Neck Cancer Database (DAHANCA) [11]. These databases, however, were built from an epidemiological perspective for clinical trials and did not use defined evidence-based quality indicators from the start. An epidemiological database is not primarily intended for quality registration from the perspective of process indicators, Patient Reported Outcomes (PROs) and Patients' Experiences (PREs) [12].

Selection method based on both process and outcome indicators

The basic assumption for the quality indicator development procedure used for the DHNA was that outcome indicators formed the basis of process and structural indicators. In addition, the indicators were developed from three different perspectives (Figure 1).

The DHNA outcome indicators followed the three-tiered hierarchy for value-based healthcare, as developed by Porter [13]. The first tier, e.g. survival, is generally the most important, and lower-tier outcomes, e.g. sustainability of health, follow from the success of the higher tiers. We therefore followed the current trend to focus on outcomes [14], such as disease-specific mortality and survival or PROs.

There are three reasons not to focus solely on outcome indicators. Firstly, process indicators are more sensitive in measuring differences in quality of care [15]. Secondly, a process indicator is easier to interpret, whereas an outcome indicator, for example mortality, is a more indirect measure [15]. Thirdly, measuring outcome performance alone provides no information on how to begin addressing problems. When a hospital discovers poor performance for one particular outcome, the first step is to dissect the outcome into its different components, and to ensure adherence to all best practice recommendations at process level [14]. For these reasons, outcome indicators on their own seem to measure quality of care to a lesser extent [16], and some researchers push the pendulum back towards process measures [14].

Furthermore, the link between process and outcome indicators is often unclear. This is mainly because large amounts of data and sufficient follow-up years are necessary to analyze this association. With the DHNA, the link between process and outcome indicators can be analyzed once more data are available in the future.

Privacy and juridical challenges for collecting patient data

In the DHNA, the PROs and PREs are requested via online patient questionnaires; the remaining data are recorded by healthcare providers themselves in an online registration system. All collected data are stored in a database. To ensure that data are analyzed according to current privacy rights and regulations, it was necessary to set up contracts between the HNC centers and the data processors. However, hospitals and their HNC centers appeared to have their own interpretation of legal regulations regarding aspects such as the exchange of encrypted data and data ownership. Unfortunately, the lawyers could not reach consensus. This underlines the need for uniform regulation of privacy aspects.
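
One technical building block for privacy-conscious data exchange is pseudonymization. The minimal sketch below (in Python) shows how a direct patient identifier could be replaced by a keyed hash before records leave an HNC center, so that questionnaire data and clinical registration data can still be linked without exposing the identifier itself. The function, key handling and identifiers are illustrative assumptions and do not describe the actual DHNA infrastructure.

    import hmac
    import hashlib

    def pseudonymize(patient_id, secret_key):
        """Replace a direct identifier with a keyed hash (pseudonym).

        The same patient always maps to the same pseudonym, so questionnaire
        and clinical registration records can still be linked, while the
        receiving party never sees the original identifier."""
        return hmac.new(secret_key, patient_id.encode("utf-8"), hashlib.sha256).hexdigest()

    # Hypothetical usage: the secret key stays with the HNC center; only pseudonyms leave it.
    SECRET_KEY = b"key-managed-by-the-center"
    print(pseudonymize("patient-12345", SECRET_KEY))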

Current Quality of Head and Neck Cancer Care in the Netherlands


The final set of indicators consisted of five outcome indicators (survival, recurrence, complications, PROs and PREs), and 13 and 18 process indicators from the medical specialist perspective and the allied health professional perspective respectively, while three structure indicators from the allied health professional perspective were developed within the DHNA. From the patients' perspective, a total of 34 relevant themes of needs and preferences were identified to obtain tools to make current integrated HNC care more patient-centred [4]. The results and usefulness of three indicators will be discussed in this paragraph, namely: 1) Survival: since this is one of the most important outcomes relevant for both patients and professionals; 2) Time to treatment interval: since patients noted that there is an urgent need to reduce waiting times in the hospital; and 3) PROs: since these are increasingly used to measure quality of care and provide us with information about how the patient feels.

Survival

Previous (European) studies showed that the survival of HNC patients in the Netherlands is relatively high [17,18]. Compared to other countries in Europe, the Netherlands is one of the best performers on survival. In the EUROCARE-5 population-based study of head and neck cancers diagnosed in the early 2000s, for example, five-year survival for patients with laryngeal cancer was 68.9% in the Netherlands compared to 58.9% in Europe. For patients with oral cancer the difference was similar, namely 56.1% in the Netherlands compared to 45.4% in Europe [18]. This shows that HNC care in the Netherlands performs relatively well with regard to survival, which could be indicative of quality of care. This might be due to the concentration and centralization of HNC care since 1984 under the umbrella of the Dutch Head Neck Society (DHNS) [19]. Monitoring the quality of integrated HNC care using the DHNA provides opportunities to further explore the association between survival and quality of care.

Time to treatment interval

In the Netherlands, all professional associations related to HNC care agreed that 80% of all new patients should receive their primary treatment within 30 calendar days of the first consultation at an HNC center. However, nationwide, only 48% of the patients start their treatment within 30 calendar days, with a variation of 20-72% between HNC centers [5]. A previous study in the Netherlands (2007) showed an average 'time to treatment interval' of 28 days, with a variation of 5-95 days between diagnostic and radiotherapy planning scans [20]. So, results have not changed that much in ten years and further improvement is still possible. By visualizing the 'time to treatment interval' using the DHNA, and providing active feedback to health professionals, this can be improved in the future.
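
As a minimal illustration of how such an indicator can be computed from registry data, the sketch below derives the 'time to treatment interval' and the adherence to the 30-calendar-day norm for a handful of invented records; the dates and data layout are assumptions and do not reflect the actual DHNA data model.

    from datetime import date

    # Invented records: (first consultation at the HNC center, start of primary treatment).
    patients = [
        (date(2016, 3, 1), date(2016, 3, 24)),
        (date(2016, 3, 7), date(2016, 4, 20)),
        (date(2016, 3, 14), date(2016, 4, 8)),
    ]

    NORM_DAYS = 30  # national agreement: start of treatment within 30 calendar days
    intervals = [(start - consult).days for consult, start in patients]
    within_norm = sum(1 for days in intervals if days <= NORM_DAYS)
    adherence = 100 * within_norm / len(intervals)

    print(f"intervals (days): {intervals}")
    print(f"adherence to the 30-day norm: {adherence:.0f}% (agreed target: 80%)")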

Patient Reported Outcomes (PRO)

The Netherlands is one of the countries that seem to be most advanced in implementing PROs, and it also appears to be leading the way in including PROs in national registries [21]. Internationally, there is a policy shift towards value-based health care and health outcome evaluations, for example in the UK, US, Sweden and the Netherlands. The first DHNA results concerning the PROs showed that function and symptoms differ between types of treatment, follow-up moments, age and tumor staging [6]. The results and methods used were comparable to other studies [22,23]. Many studies focus on differences in patient-reported outcomes and on what can be done by hospitals and health professionals to increase the effectiveness of using PROs [24]. However, a crucial step in value-based health care is determining the effect of measuring PROs on patient outcomes such as survival. A good example of this is given by Basch, et al., who recently showed that survival increased significantly in patients who monitored their symptoms with PROs compared to patients who received standard care [25]. So far, this has been studied to a lesser extent and will be one of the future aims of the DHNA.

Current Tools for Improving Quality of Healthcare


Variation in delivered HNC care already visible

In general, an audit registration such as the DHNA needs a couple of years of data to provide stable results [26]. As the first data were collected in December 2014, it is too early to present results on all indicators. Preliminary results of the DHNA show that, even in a recently launched quality registration with 2,400 new HNC patients included, variations in the delivery of current processes of care among HNC centers are already visible. Feedback on indicators in the DHNA is given via an automatic online dashboard, which is only accessible to staff at individual HNC centers, who are able to view the scores of other HNC centers anonymously along with the average score [27]. This system allows health professionals to easily compare the performance of their own HNC center with a nationwide benchmark and to start acting on their own results. As mentioned above, variation was shown for the time to treatment interval from first consultation to start of treatment. This can be one of the first starting points to share best practices between hospitals and decrease the time to treatment. Besides transparency within and between hospitals, the first results of the DHNA can also be shown to the public in the upcoming years, in other words: Public transparency.
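
The sketch below illustrates, with invented center names and scores, how such an anonymous benchmark view could be composed: a center sees its own indicator score by name, the national average, and the other centers only as anonymous peers. The actual DHNA dashboard is an online application; this is merely a simplified illustration of the comparison logic.

    # Invented indicator scores; a center sees its own score by name,
    # the other centers only as anonymous peers, plus the national mean.
    scores = {"Center A": 62.0, "Center B": 48.0, "Center C": 71.0, "Center D": 55.0}

    def benchmark_view(own_center, scores):
        national_mean = sum(scores.values()) / len(scores)
        print(f"{own_center}: {scores[own_center]:.0f}% (national mean: {national_mean:.0f}%)")
        for i, peer_score in enumerate(sorted(v for c, v in scores.items() if c != own_center), 1):
            print(f"  anonymous peer {i}: {peer_score:.0f}%")

    benchmark_view("Center B", scores)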

Results already visible for patients

For most outcome indicators, such as recurrence rates and survival, it takes several years before stable and reliable data are complete enough to be interpreted. However, some results of the DHNA are already visible for patients, namely PROs and PREs. Patients can directly see how their results differ from the last time. In addition, they can bring the results along to the medical consult and discuss their concerns or ask for possible solutions. In the future, the health professional will also be able to check for outliers or relevant differences compared to a previous consult in his or her own electronic system, prior to the consult. Together with the patient, they can start acting on the results at an earlier stage, thus improving rehabilitation. Therefore, an automatic feedback loop towards the patient and the health professional, and introducing the relevance of PROs in the medical consult, can improve quality of care in small steps and earlier on, compared to quality improvements that depend on aspects such as recurrence rates and survival.
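
A minimal sketch of such an automatic check is given below: PRO domain scores from the current questionnaire are compared with those from the previous consult and flagged when the change exceeds a chosen threshold. The domains, scores and the 10-point threshold on a 0-100 scale are assumptions chosen for illustration, not DHNA specifications.

    # Invented PRO domain scores on a 0-100 scale; higher symptom scores mean more complaints.
    previous = {"swallowing": 25, "speech": 40, "pain": 20}
    current = {"swallowing": 45, "speech": 38, "pain": 22}

    THRESHOLD = 10  # assumed threshold for a change worth discussing at the consult

    for domain, score in current.items():
        change = score - previous[domain]
        if abs(change) >= THRESHOLD:
            print(f"discuss at the next consult: {domain} changed by {change:+d} points")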

Feedback

Although positive effects of audit and feedback in general have been reported, e.g. decreased duration of hospital stay [28] and decreased mortality rates [29], this method of improving quality of care has not been found to be consistently effective [30-33]. Previous research shows that the format of feedback may significantly affect the interpretation of data [34-36]. The DHNA showed that tailored reports of feedback on professional practice and healthcare outcomes are recommended, since feedback preferences differ between medical specialists, allied health professionals, and health insurers [27]. In general, the preferences for receiving feedback differ regarding content but not regarding lay-out. This knowledge gives us tools to improve the effects of audit and feedback by adapting the feedback format and contents to the preferences of stakeholders.

Methodological considerations

Challenges in developing indicators

Developing evidence-based indicators for the DHNA from the perspective of allied health professionals proved to be quite a challenge, as there are hardly any (inter)national guidelines that provide evidence-based recommendations for daily healthcare delivery [3]. The indicators were developed in collaboration with the Dutch national foundation for allied health professionals in this specialist sector, the Paramedische Werkgroep Hoofd Halstumoren (PWHHT). Panel members were instructed to discuss the potential indicators with the allied health professionals of their own discipline, in their own center, and in other Dutch HNC centers as well. For some disciplines, variation in delivery of care between the different centers became visible. Therefore, the development of indicators was more of a starting point for debate about how HNC care should be delivered. As a consequence, developing indicators from the allied health perspective took more time to reach agreement than developing the medical indicators. Discussion remained about the indicators developed from the perspective of speech therapists; therefore, new indicators were developed after one year. Overall, evidence-based guidelines or literature are important requirements for developing evidence-based indicators. However, evidence-based guidelines are not always available for rare diseases. A well-performed consensus procedure is then necessary to develop useful indicators.

Interpreting results

When interpreting quality indicator scores, it may be difficult to distinguish between a lack of documentation and actual insufficient adherence to guidelines. For example, if the indicator 'Presence of a case manager or nurse practitioner at the consultation to discuss the treatment plan' does not have a positive score, it could mean that the case manager was not present, or simply that this was not documented. In addition, to reliably benchmark the performance of one hospital against the average national performance, it is crucial that all hospitals include all their patients. Otherwise, with only a proportion of patients, it is impossible to calculate a stable indicator, as 1) It is unknown which patients are missing, and 2) Variations in patient numbers can influence the adherence percentage. If this happens for outcome indicators, it might put both the hospital and the national performance at a disadvantage. Therefore, during the first year most registrations merely focus on developing indicators and the quality registration itself; the second year on ensuring that all data are collected; while in the third year the first results are anonymously presented.
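
A small worked example with invented numbers illustrates why incomplete inclusion can destabilize an indicator score:

    # Invented numbers: 100 patients were treated, 80 within the 30-day norm (true adherence 80%).
    # If only 50 patients are registered and the missing patients are not missing at random,
    # the observed adherence can deviate considerably from the true adherence.
    true_adherence = 80 / 100
    registered_within_norm, registered_total = 32, 50
    observed_adherence = registered_within_norm / registered_total
    print(f"true adherence: {true_adherence:.0%}, observed in the registry: {observed_adherence:.0%}")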

Future Perspectives


Several lessons were learned from this project; they are described in this section.

Implications for clinical practice

Increase support of health professionals

Setting up a multidisciplinary quality registration is quite challenging and time-consuming. For a successful quality registration, it is important that it is set up for and by health professionals [26]. The DHNA is an example of a quality registration in which the health insurer was involved as a partner, alongside the health professionals and the patients, when developing the quality registration. This is also called a 'tripartite' approach.

Registering at the source

The website of the National Institute for Public Health and Environment, the 'Rijksinstituut voor Volksgezondheid en Milieu' (RIVM), in the Netherlands states that there are currently 181 active quality registrations, while annual costs for quality registrations are estimated at 80 million Euros [37,38]. These costs are currently spent on registrations and not directly on the patient. One way to decrease the registration burden and the associated costs is to reduce the number of registrations and to make the quality registrations as comprehensive as possible. Another way to decrease the registration burden is to automate data extraction from hospital electronic patient records. Building a reliable automated extraction system is, of course, initially expensive, but not in the long run. Furthermore, Govaert, et al. show that improved outcomes due to auditing can also reduce costs [39].
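
In its simplest form, automated extraction maps fields from an EHR export onto the registry's fields so that professionals do not have to re-enter them manually, as sketched below. All field names are assumptions rather than the actual DHNA or EHR data model, and a real extraction system would additionally require agreed definitions, validation and secure transport.

    # All field names are assumptions, not the actual DHNA or EHR data model.
    EHR_TO_REGISTRY = {
        "date_first_visit": "first_consultation_date",
        "date_therapy_start": "treatment_start_date",
        "tnm_stage": "tumor_stage",
    }

    def extract_record(ehr_record):
        """Translate one EHR record into a registry record, keeping only the mapped fields."""
        return {registry_field: ehr_record.get(ehr_field)
                for ehr_field, registry_field in EHR_TO_REGISTRY.items()}

    ehr_record = {"date_first_visit": "2016-03-01", "date_therapy_start": "2016-03-24",
                  "tnm_stage": "T2N0M0", "free_text_note": "stays in the hospital system"}
    print(extract_record(ehr_record))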

Transparency

Besides public transparency, the key to achieving improvement collaboratively is to share results within a hospital or between hospitals. Therefore, it is crucial to present the results in such a way that they support collaborative improvement, but also represent a safe platform to share results. Moreover, the method of communication about this kind of non-public transparency is important and should not be neglected.

Implications for future research

Improve quality of care

The data from the DHNA provide the first opportunity to visualize differences in outcomes and practice performance at a national level. With this information, best practices can be identified, and ultimately, the data can be used to improve quality of care. From the first results of the DHNA, we know that variation between centers is present, and that four patient and hospital determinants influenced the indicator scores [5]. When more data are available, research can be directed towards all indicators to explore the variance and possible patient and hospital determinants.

Patient reported outcomes and experiences

The DHNA shows that patients with multimodality treatment experience a less well-organized healthcare process, and have lower functional scores and more symptoms, compared to patients with mono-modality treatment [6]. In the future, more data should be collected to obtain more insight into long-term quality of life and patients' experiences. Apart from carrying out research on the outcomes themselves, studies on the effect of measuring PROs on patient outcomes such as survival should also be undertaken.

Evaluation of quality indicators of the DHNA

The first quality indicators for the DHNA were developed in the start-up phase of the quality registration. After three years, more data will be obtained, enabling more reliable feedback on a national level and a hospital level. This gives us tools to carefully evaluate the indicators.

Go global

Besides comparing quality of care between different HNC centers in the Netherlands, it would be interesting to compare the quality of HNC care to other countries in Europe [18]. An already existing consortium is the 'International Consortium for Health Outcomes Measurement' (ICHOM), which measures international patient outcomes. At this moment, no international HNC indicator set exists under the umbrella of ICHOM.

Implications for policy making

Development of good quality indicators for future registrations

It was difficult to develop evidence-based indicators from the perspective of allied health professionals in the DHNA, mainly because there were no national guidelines [3]. Campbell, et al. have previously described that evidence-based quality indicators form the foundation of a good quality registration, preferably developed by an evidence-based method [3,40,41]. However, the results from the quality registration can provide the first tools to discuss where and why HNC care is delivered differently, in order to reach consensus about best practice. The quality indicators themselves can thus provide evidence to improve clinical practice and to reframe national guidelines.

Privacy and juridical restrictions

Hospitals and their HNC centers have their own interpretation of legal regulations regarding aspects such as the exchange of encrypted data and data ownership. Such problems are encountered at a local level, yet require a solution at a national level. Hopefully, uniform regulation at a national level will follow in the future.

Final conclusion

The DHNA is the first quality of health care registration system in the Netherlands that involves medical specialists, allied health professionals, and patients. In addition, it is also one of the first HNC quality registrations (based on evidence-based quality indicators) on an international level. Outcome indicators formed the basis of process and structural indicators, and all indicators are evidence-based. Key elements in implementing an efficient HNC registration were keeping the health professionals involved and developing good quality indicators.

In the future, more data are needed to better explain the variation and possible patient and hospital determinants, to obtain more insight into long-term quality of life and patients' experiences, and to define the relation between PROs and PREs and patient outcomes such as survival. Hereafter, results can be shared within a hospital or between hospitals to support collaborative improvement. When hospitals give permission, data can become transparent to the public as well.

Efforts should be made on a national level to solve privacy and juridical restrictions for quality registrations. In addition, the registration load should be decreased with the use of reliable automated data extraction systems. With more data and a reduced registration load, the focus of the DHNA will move from registering data to improving the quality of HNC care (Box 1).

Role of the Funding Source


The study sponsor, CZ health insurer, did not have any role in the study design, the collection, analysis and interpretation of data, the writing of the article, or the decision to submit it for publication; the researchers were independent of the funders and sponsors.

Précis


An overview of the development, first results, and implications for clinical practice, future research and policymaking of the Dutch Head and Neck Audit Group.

References


  1. Morris AM (2015) Putting the integration into integrated health care systems. J Clin Oncol 33: 821-822.
  2. National Health Service (NHS) (2002) Principles of best clinical practice in clinical audit. Radcliffe Medical Press Ltd, Abingdon, UK.
  3. van Overveld LFJ, Braspenning JCC, Hermens RPMG (2017) Quality indicators of integrated care for patients with head and neck cancer. Clin Otolaryngol 42: 322-329.
  4. van Overveld LFJ, Takes RP, Turan AS, et al. (2017) Needs and preferences of patients with head and neck cancer in integrated care. Clin Otolaryngol.
  5. Van Overveld LFJ, Takes RP, Braspenning JCC, et al. (2017) Variation in integrated head and neck cancer care: impact of patient and hospital characteristics.
  6. Van Overveld LFJ, van Hoogstraten LMC, Takes RP, et al. (2017) Patient-reported outcomes and experiences in Dutch integrated head and neck cancer care.
  7. Van Leersum NJ, Snijders HS, Henneman D, et al. (2013) The Dutch surgical colorectal audit. Eur J Surg Oncol 39: 1063-1070.
  8. Dutch Institute for Clinical Audit (DICA).
  9. van den Berg MG, Rasmussen-Conrad EL, Wei KH, et al. (2010) Comparison of the effect of individual dietary counselling and of standard nutritional care on weight loss in patients with head and neck cancer undergoing radiotherapy. Br J Nutr 104: 872-877.
  10. Head and Neck Audit (HANA).
  11. Danish Head and Neck Cancer (DAHANCA).
  12. Schmidt M, Schmidt SA, Sandegaard JL, et al. (2015) The Danish National Patient Registry: A review of content, data quality, and research potential. Clin Epidemiol 7: 449-490.
  13. Porter ME, Teisberg EO (2007) How physicians can change the future of health care. JAMA 297: 1103-1111.
  14. Bilimoria KY (2015) Facilitating quality improvement: Pushing the pendulum back toward process measures. JAMA 314: 1333-1334.
  15. Mant J (2001) Process versus outcome indicators in the assessment of quality of health care. International Journal for Quality in Health Care 13: 475-480.
  16. Brook RH, McGlynn EA, Shekelle PG (2000) Defining and measuring quality of care: A perspective from US researchers. Int J Qual Health Care 12: 281-295.
  17. Zigon G, Berrino F, Gatta G, et al. (2011) Prognoses for head and neck cancers in Europe diagnosed in 1995-1999: A population-based study. Ann Oncol 22: 165-174.
  18. Gatta G, Botta L, Sanchez MJ, et al. (2015) Prognoses and improvement for head and neck cancers diagnosed in Europe in early 2000s: The EUROCARE-5 population-based study. Eur J Cancer 51: 2130-2143.
  19. Dutch Head and Neck Society (NWHHT).
  20. Jensen AR, Nellemann HM, Overgaard J (2007) Tumor progression in waiting time for radiotherapy in head and neck cancer. Radiother Oncol 84: 5-10.
  21. Williams K, Sansoni J, Morris D, et al. (2016) Patient-Reported outcome measures. Australian Commission on Safety and Quality in Health Care, Sydney.
  22. Sherman AC, Simonton S, Adams DC, et al. (2000) Assessing quality of life in patients with head and neck cancer: cross-validation of the European Organization for Research and Treatment of Cancer (EORTC) Quality of Life Head and Neck module (QLQ-H&N35). Arch Otolaryngol Head Neck Surg 126: 459-467.
  23. Ramaekers BL, Joore MA, Grutters JP, et al. (2011) The impact of late treatment-toxicity on generic health-related quality of life in head and neck cancer patients after radiotherapy. Oral Oncol 47: 768-774.
  24. Greenhalgh J, Dalkin S, Gooding K, et al. (2017) Functionality and feedback: A realist synthesis of the collation, interpretation and utilisation of patient-reported outcome measures data to improve patient care. Health Services and Delivery Research.
  25. Basch E, Deal AM, Dueck AC, et al. (2017) Overall survival results of a trial assessing patient-reported outcomes for symptom monitoring during routine cancer treatment. JAMA 318: 197-198.
  26. Breedveld F, Grol R, Hoeksem J, et al. (2012) Kwaliteitsregistraties als basis voor verbeteren en vertrouwen. NFU.
  27. van Overveld LFJ, Takes RP, Vijn TW, et al. (2017) Feedback preferences of patients, professionals and health insurers in integrated head and neck cancer care. Health Expect 20: 1275-1288.
  28. Landis-Lewis Z, Brehaut JC, Hochheiser H, et al. (2015) Computer-supported feedback message tailoring: Theory-informed adaptation of clinical audit and feedback for learning and behavior change. Implement Sci 10: 12.
  29. Wright J, Dugdale B, Hammond I, et al. (2006) Learning from death: A hospital mortality reduction programme. J R Soc Med 99: 303-308.
  30. Jamtvedt G, Young JM, Kristoffersen DT, et al. (2006) Does telling people what they have been doing change what they do? A systematic review of the effects of audit and feedback. Qual Saf Health Care 15: 433-436.
  31. Jamtvedt G, Young JM, Kristoffersen DT, et al. (2006) Audit and feedback: Effects on professional practice and health care outcomes. Cochrane Database Syst Rev.
  32. Baker R, Camosso-Stefinovic J, Gillies C, et al. (2010) Tailored interventions to overcome identified barriers to change: Effects on professional practice and health care outcomes. Cochrane Database Syst Rev.
  33. Foy R, Eccles MP, Jamtvedt G, et al. (2005) What do we know about how to do audit and feedback? Pitfalls in applying evidence from a systematic review. BMC Health Serv Res 5: 50.
  34. Damman OC, De Jong A, Hibbard JH, et al. (2015) Making comparative performance information more comprehensible: An experimental evaluation of the impact of formats on consumer understanding. BMJ Qual Saf 25: 860-869.
  35. Brundage M, Feldman-Stewart D, Leis A, et al. (2005) Communicating quality of life information to cancer patients: A study of six presentation formats. J Clin Oncol 23: 6949-6956.
  36. Macdonald-Ross M (1977) How numbers are shown: A review of research on the presentation of quantitative data in texts. AV Comm Rev 25: 359-409.
  37. Nederlandse Vereniging voor Ziekenhuizen (NVZ), KPMG (2015) Onderzoek kosten kwaliteitsmetingen. 1-32.
  38. Kringos DS, Horenberg F, Bal R, et al. (2016) Afwegingen voor de maatschappelijke relevantie van kwaliteitsregistraties. Academisch Medisch Centrum (AMC), Amsterdam.
  39. Govaert JA, van Dijk WA, Fiocco M, et al. (2016) Nationwide outcomes measurement in colorectal cancer surgery: Improving quality and reducing costs. J Am Coll Surg 222: 19-29.e2.
  40. Campbell SM, Braspenning J, Hutchinson A, et al. (2003) Research methods used in developing and applying quality indicators in primary care. BMJ 326: 816-819.
  41. Fitch K, Bernstein SJ, Aguilar MD, et al. (2000) The RAND/UCLA appropriateness method user's manual. RAND, Santa Monica.
