Interpretable machine learning model to predict survival days of malignant brain tumor patients
An artificial intelligence (AI) model's performance is strongly influenced by its input features; therefore, it is vital to find the optimal feature set. This is especially crucial for survival prediction in glioblastoma multiforme (GBM), a malignant type of brain tumor. In this study, we identify the best featu...
Main Authors: Snehal Rajput, Rupal A Kapdi, Mehul S Raval, Mohendra Roy
Format: Article
Language: English
Published: IOP Publishing, 2023-01-01
Series: Machine Learning: Science and Technology
Online Access: https://doi.org/10.1088/2632-2153/acd5a9
Similar Items
- A triplanar ensemble model for brain tumor segmentation with volumetric multiparametric magnetic resonance images
  by: Snehal Rajput, et al.
  Published: (2024-06-01)
- Evaluating perceptual and semantic interpretability of saliency methods: A case study of melanoma
  by: Harshit Bokadia, et al.
  Published: (2022-09-01)
- Exploring Evaluation Methods for Interpretable Machine Learning: A Survey
  by: Nourah Alangari, et al.
  Published: (2023-08-01)
- Effects of Class Imbalance Countermeasures on Interpretability
  by: David Cemernek, et al.
  Published: (2024-01-01)
- Intrinsically Interpretable Gaussian Mixture Model
  by: Nourah Alangari, et al.
  Published: (2023-03-01)