Evaluation of the Shapley Additive Explanation Technique for Ensemble Learning Methods


Bibliographic Details
Main Author: Tsehay Admassu Assegie
Format: Article
Language: English
Published: Taiwan Association of Engineering and Technology Innovation, 2022-04-01
Series: Proceedings of Engineering and Technology Innovation
Online Access: https://ojs.imeti.org/index.php/PETI/article/view/9025
Description
Summary: This study aims to explore the effectiveness of the Shapley additive explanation (SHAP) technique in developing a transparent, interpretable, and explainable ensemble method for heart disease diagnosis using random forest algorithms. First, the features with the highest impact on heart disease prediction are selected with SHAP using a dataset of 1,025 heart disease records obtained from a publicly available Kaggle data repository. The most influential features are then used, together with the SHAP technique, to develop an interpretable ensemble learning model that automates heart disease diagnosis. Finally, the performance of the developed model is evaluated, with the SHAP values used to improve diagnostic performance. The experimental results show that the developed model achieves 100% prediction accuracy. In addition, the experiments show that age, chest pain, and maximum heart rate have a positive impact on the prediction outcome.
ISSN: 2413-7146, 2518-833X
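
The pipeline described in the summary (SHAP-based feature ranking on the 1,025-record Kaggle heart disease data, a random forest retrained on the selected features, and a final evaluation) can be sketched roughly as follows. This is a minimal illustration under stated assumptions rather than the authors' implementation: the file name heart.csv, the label column target, the 80/20 split, and the cutoff of eight retained features are all assumptions, and the shape of the SHAP output varies across shap library versions.

import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load the Kaggle heart disease data (the file name "heart.csv" and the
# label column "target" are assumptions, not taken from the article).
data = pd.read_csv("heart.csv")
X, y = data.drop(columns="target"), data["target"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Step 1: fit a random forest and rank features by mean absolute SHAP value.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
explainer = shap.TreeExplainer(model)
sv = explainer.shap_values(X_train)
# Older shap versions return a list of per-class arrays; newer ones a 3-D array.
sv_pos = sv[1] if isinstance(sv, list) else sv[:, :, 1]
ranking = np.abs(sv_pos).mean(axis=0).argsort()[::-1]
top_features = X.columns[ranking[:8]]  # keeping 8 features is an assumed cutoff

# Step 2: retrain the ensemble on the SHAP-selected features only.
reduced_model = RandomForestClassifier(n_estimators=100, random_state=42)
reduced_model.fit(X_train[top_features], y_train)

# Step 3: evaluate the reduced model on held-out data.
accuracy = accuracy_score(y_test, reduced_model.predict(X_test[top_features]))
print(f"Selected features: {list(top_features)}")
print(f"Test accuracy: {accuracy:.3f}")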