Same-Subject-Modalities-Interactions: A Novel Framework for MRI and PET Multi-Modality Fusion for Alzheimer’s Disease Classification
Alzheimer’s disease is a growing concern, and neuroimaging techniques such as Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) scans are widely used to classify AD patients. While MRI captures structural information and measures brain atrophy, PET shows functional c...
Main Authors: | Bouchra Guelib, Karim Zarour, Haithem Hermessi, Bounab Rayene, Khlifa Nawres |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2023-01-01 |
Series: | IEEE Access |
Subjects: | Multimodality; MRI; PET; fusion; machine learning; feature selection |
Online Access: | https://ieeexplore.ieee.org/document/10125572/ |
_version_ | 1827941088366690304 |
---|---|
author | Bouchra Guelib; Karim Zarour; Haithem Hermessi; Bounab Rayene; Khlifa Nawres |
author_facet | Bouchra Guelib; Karim Zarour; Haithem Hermessi; Bounab Rayene; Khlifa Nawres |
author_sort | Bouchra Guelib |
collection | DOAJ |
description | Alzheimer’s disease (AD) is a growing concern, and neuroimaging techniques such as Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) scans are widely used to classify AD patients. While MRI captures structural information and measures brain atrophy, PET shows functional changes associated with neurological disorders, and both modalities have been proven to be AD biomarkers. However, combining MRI and PET in the same test without considering their inherent structural differences can result in a loss of important information. To address this issue, this paper proposes a novel machine learning framework for combining the MRI and PET modalities and a new set of interactions, known as Same-Subject-Modalities-Interactions (SSMI), to extract complementary information and new insights. The SSMI relation is derived from MRI and PET and subjected to PCA to construct the SSMI set, which is then concatenated with the other sets. The best set of features is selected and used for classification with a Ridge classifier. FreeSurfer is used to extract measures from 183 ADNI subjects (69 in the AD group and 114 in the CN group), and different classifiers are evaluated with train-test split, cross-validation, and a validation set from ADNI-2/GO. The results showed high accuracy, precision, specificity, recall, F1-score, and AUC, with values of 98.94%, 98.27%, 97.10%, 100%, 99.13%, and 98.55%, respectively, on ADNI, and 98.75%, 98.48%, 93.75%, 100.0%, 99.23%, and 96.80% on ADNI-2/GO. These results are higher than those achieved by single-modality classification and state-of-the-art approaches. Furthermore, the regions selected by the Ridge classifier are shown to be highly related to Alzheimer’s disease biomarkers. |
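The description above outlines a fusion pipeline: derive a same-subject interaction from the MRI and PET feature sets, reduce it with PCA to form the SSMI set, concatenate it with the single-modality sets, select features, and classify with a Ridge classifier. The following is a minimal, hypothetical scikit-learn sketch of that flow; the actual SSMI interaction formula, feature counts, selection method, and PCA dimensionality are not specified in this record, so an element-wise product and arbitrary sizes are assumed purely for illustration.

```python
# Hypothetical sketch of an SSMI-style fusion pipeline (assumptions noted below).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_subjects, n_features = 183, 100                  # 183 ADNI subjects per the abstract
mri = rng.normal(size=(n_subjects, n_features))    # placeholder for FreeSurfer MRI measures
pet = rng.normal(size=(n_subjects, n_features))    # placeholder for PET measures
y = rng.integers(0, 2, size=n_subjects)            # AD (1) vs. CN (0) labels

# Assumed same-subject interaction term (element-wise product), then PCA to build the SSMI set
ssmi = PCA(n_components=20).fit_transform(mri * pet)

# Concatenate the SSMI set with the single-modality sets
X = np.hstack([mri, pet, ssmi])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Feature selection followed by a Ridge classifier, as in the described framework
model = make_pipeline(StandardScaler(),
                      SelectKBest(f_classif, k=50),
                      RidgeClassifier())
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```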
first_indexed | 2024-03-13T09:32:00Z |
format | Article |
id | doaj.art-0a0e3d7658794a2096ed8cdb1b5a7f4e |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-03-13T09:32:00Z |
publishDate | 2023-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-0a0e3d7658794a2096ed8cdb1b5a7f4e | 2023-05-25T23:00:26Z | eng | IEEE | IEEE Access | ISSN 2169-3536 | 2023-01-01 | Vol. 11, pp. 48715–48738 | DOI: 10.1109/ACCESS.2023.3276722 | Article no. 10125572 | Same-Subject-Modalities-Interactions: A Novel Framework for MRI and PET Multi-Modality Fusion for Alzheimer’s Disease Classification | Bouchra Guelib (https://orcid.org/0000-0001-7338-2423), Karim Zarour (https://orcid.org/0000-0002-2727-2036), and Bounab Rayene (https://orcid.org/0000-0001-9809-2617): LIRE Laboratory, Faculty of New Technologies of Information and Communication, University of Abdelhamid Mehri Constantine 2, Constantine, Algeria; Haithem Hermessi (https://orcid.org/0000-0001-6243-3272): Laboratory of Informatics, Modeling and Information and Knowledge Processing (LIMTIC), Intelligent Systems in Imaging and Artificial Vision (SIIVA) Team, Higher Institute of Computer Science, University of Tunis El Manar, Ariana, Tunisia; Khlifa Nawres: Research Laboratory of Biophysics and Medical Technologies, Higher Institute of Medical Technologies of Tunis, University of Tunis El Manar, Tunis, Tunisia | https://ieeexplore.ieee.org/document/10125572/ | Keywords: Multimodality; MRI; PET; fusion; machine learning; feature selection |
spellingShingle | Bouchra Guelib; Karim Zarour; Haithem Hermessi; Bounab Rayene; Khlifa Nawres | Same-Subject-Modalities-Interactions: A Novel Framework for MRI and PET Multi-Modality Fusion for Alzheimer’s Disease Classification | IEEE Access | Multimodality; MRI; PET; fusion; machine learning; feature selection |
title | Same-Subject-Modalities-Interactions: A Novel Framework for MRI and PET Multi-Modality Fusion for Alzheimer’s Disease Classification |
title_full | Same-Subject-Modalities-Interactions: A Novel Framework for MRI and PET Multi-Modality Fusion for Alzheimer’s Disease Classification |
title_fullStr | Same-Subject-Modalities-Interactions: A Novel Framework for MRI and PET Multi-Modality Fusion for Alzheimer’s Disease Classification |
title_full_unstemmed | Same-Subject-Modalities-Interactions: A Novel Framework for MRI and PET Multi-Modality Fusion for Alzheimer’s Disease Classification |
title_short | Same-Subject-Modalities-Interactions: A Novel Framework for MRI and PET Multi-Modality Fusion for Alzheimer’s Disease Classification |
title_sort | same subject modalities interactions a novel framework for mri and pet multi modality fusion for alzheimer s disease classification |
topic | Multimodality; MRI; PET; fusion; machine learning; feature selection |
url | https://ieeexplore.ieee.org/document/10125572/ |
work_keys_str_mv | AT bouchraguelib samesubjectmodalitiesinteractionsanovelframeworkformriandpetmultimodalityfusionforalzheimersdiseaseclassification AT karimzarour samesubjectmodalitiesinteractionsanovelframeworkformriandpetmultimodalityfusionforalzheimersdiseaseclassification AT haithemhermessi samesubjectmodalitiesinteractionsanovelframeworkformriandpetmultimodalityfusionforalzheimersdiseaseclassification AT bounabrayene samesubjectmodalitiesinteractionsanovelframeworkformriandpetmultimodalityfusionforalzheimersdiseaseclassification AT khlifanawres samesubjectmodalitiesinteractionsanovelframeworkformriandpetmultimodalityfusionforalzheimersdiseaseclassification |