Using Explainable Machine Learning to Explore the Impact of Synoptic Reporting on Prostate Cancer
Machine learning (ML) models have proven to be an attractive alternative to traditional statistical methods in oncology. However, they are often regarded as black boxes, hindering their adoption for answering real-life clinical questions. In this paper, we show a practical application of explainable machine learning (XML). Specifically, we explored the effect that synoptic reporting (SR; i.e., reports where data elements are presented as discrete data items) in pathology has on the survival of a population of 14,878 Dutch prostate cancer patients. We compared the performance of a Cox Proportional Hazards (CPH) model against that of an eXtreme Gradient Boosting (XGB) model in predicting ranked patient survival. We found that the XGB model (c-index = 0.67) performed significantly better than the CPH model (c-index = 0.58). Moreover, we used Shapley Additive Explanations (SHAP) values to generate a quantitative mathematical representation of how features, including the use of SR, contributed to the models' output. The XGB model, in combination with SHAP visualizations, revealed interesting interaction effects between SR and the other most important features. These results hint that SR has a moderate positive impact on predicted patient survival. Moreover, adding an explainability layer to predictive ML models can open their black box, making them more accessible and easier for the user to understand. This can make XML-based techniques appealing alternatives to the classical methods used in oncological research and in health care in general.
Main Authors: | Femke M. Janssen, Katja K. H. Aben, Berdine L. Heesterman, Quirinus J. M. Voorham, Paul A. Seegers, Arturo Moncada-Torres |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-01-01 |
Series: | Algorithms |
Subjects: | Cox Proportional Hazards (CPH); explainable AI; eXtreme Gradient Boosting (XGB); interpretability; oncology; prostatectomy |
Online Access: | https://www.mdpi.com/1999-4893/15/2/49 |
collection | DOAJ |
id | doaj.art-c2ba8cd712084cd783321849967bcf8b |
issn | 1999-4893 |
doi | 10.3390/a15020049 |
citation | Algorithms, vol. 15, no. 2, article 49 (2022) |
affiliations | Femke M. Janssen, Katja K. H. Aben, Berdine L. Heesterman, Arturo Moncada-Torres: The Netherlands Comprehensive Cancer Organization (IKNL), 5612 HZ Eindhoven, The Netherlands; Quirinus J. M. Voorham, Paul A. Seegers: Nationwide Network and Registry of Histo- and Cytopathology in The Netherlands (PALGA), 1066 CX Amsterdam, The Netherlands |
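For readers who want a concrete sense of the workflow summarized in the abstract (a Cox Proportional Hazards baseline versus an XGBoost survival model, compared on the concordance index and explained with SHAP values), the following is a minimal Python sketch. It is an illustration under stated assumptions, not the authors' code: the input file, column names (age, psa, gleason_score, synoptic_reporting, time, event), and hyperparameters are hypothetical placeholders, and it relies on the lifelines, xgboost, scikit-learn, and shap packages.

```python
# Minimal sketch: CPH vs. XGBoost survival model, compared on the c-index,
# with SHAP explanations for the XGBoost model. Dataset and columns are
# hypothetical placeholders.
import numpy as np
import pandas as pd
import xgboost as xgb
import shap
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index
from sklearn.model_selection import train_test_split

# Hypothetical cohort: one row per patient, with follow-up time, event
# indicator (1 = death, 0 = censored), clinical features, and a binary
# flag for synoptic reporting (SR).
df = pd.read_csv("prostate_cancer_cohort.csv")  # placeholder file name
features = ["age", "psa", "gleason_score", "synoptic_reporting"]  # placeholder columns

train, test = train_test_split(df, test_size=0.2, random_state=42)

# --- Cox Proportional Hazards model (lifelines) ---
cph = CoxPHFitter()
cph.fit(train[features + ["time", "event"]], duration_col="time", event_col="event")
cph_risk = cph.predict_partial_hazard(test[features])
# concordance_index expects scores where higher = longer survival,
# so the predicted hazard is negated.
cph_cindex = concordance_index(test["time"], -cph_risk, test["event"])

# --- XGBoost with a Cox objective ---
# xgboost's survival:cox objective encodes censoring in the label sign:
# positive = event time, negative = censoring time.
y_train = np.where(train["event"] == 1, train["time"], -train["time"])
y_test = np.where(test["event"] == 1, test["time"], -test["time"])
dtrain = xgb.DMatrix(train[features], label=y_train)
dtest = xgb.DMatrix(test[features], label=y_test)
bst = xgb.train({"objective": "survival:cox", "eval_metric": "cox-nloglik"},
                dtrain, num_boost_round=200)
xgb_risk = bst.predict(dtest)  # higher score = higher predicted hazard
xgb_cindex = concordance_index(test["time"], -xgb_risk, test["event"])

print(f"CPH c-index: {cph_cindex:.2f}, XGB c-index: {xgb_cindex:.2f}")

# --- SHAP values for the XGBoost model ---
explainer = shap.TreeExplainer(bst)
shap_values = explainer.shap_values(test[features])
shap.summary_plot(shap_values, test[features])  # per-feature contribution plot
```

Both models are ranked on the same metric reported in the abstract, the c-index, and SHAP's TreeExplainer operates directly on the fitted gradient-boosted trees, which is what makes the feature-level (including SR) contribution plots possible.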