A Multimodal Deep Learning Approach to Predicting Systemic Diseases from Oral Conditions

<b>Background:</b> Oral diseases such as periodontal (gum) disease are closely linked to various systemic diseases and disorders. Advances in deep learning have the potential to make major contributions to healthcare, particularly in domains that rely on medical imaging, and incorporating non-imaging information from clinical and laboratory data may allow clinicians to make more comprehensive and accurate decisions. <b>Methods:</b> We developed a multimodal deep learning method to predict systemic diseases and disorders from oral health conditions. In the first phase, a dual-loss autoencoder extracted periodontal disease-related features from 1188 panoramic radiographs. In the second phase, we fused these image features with demographic data and clinical information from electronic health records (EHR) to predict systemic diseases. We evaluated the model using receiver operating characteristic (ROC) curves and accuracy, and further validated it on an unseen test dataset. <b>Findings:</b> The three most accurately predicted chapters were Chapters III, VI and IX, with AUC values of 0.92 (95% CI, 0.90–0.94), 0.87 (95% CI, 0.84–0.89) and 0.78 (95% CI, 0.75–0.81), respectively. On the unseen test dataset, the model achieved accuracies of 0.88, 0.82 and 0.72 for Chapters III, VI and IX, respectively. <b>Interpretation:</b> The present study shows that panoramic radiographs combined with clinical oral features can be used to train a fusion deep learning model for predicting systemic diseases and disorders.
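To make the two-phase pipeline described in the Methods more concrete, the sketch below shows one way such a model could be wired up in PyTorch: a dual-loss autoencoder (an image-reconstruction loss plus an auxiliary periodontal-status classification loss on the latent code) learns features from panoramic radiographs, and a fusion network then concatenates those features with a demographic/EHR vector to predict systemic-disease chapters. All class names, layer sizes, loss weights and the output head here are illustrative assumptions, not the authors' published implementation.

```python
# Illustrative sketch only (PyTorch): layer sizes, loss weights, and the output
# head are assumptions for exposition, not the paper's actual code.
import torch
import torch.nn as nn

class DualLossAutoencoder(nn.Module):
    """Autoencoder trained with a reconstruction loss plus an auxiliary
    periodontal-status classification loss on the latent code (phase 1)."""
    def __init__(self, latent_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(                       # 1x256x256 radiograph -> latent vector
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 64 * 64, latent_dim),
        )
        self.decoder = nn.Sequential(                       # latent vector -> reconstructed image
            nn.Linear(latent_dim, 32 * 64 * 64), nn.ReLU(),
            nn.Unflatten(1, (32, 64, 64)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )
        self.perio_head = nn.Linear(latent_dim, 2)          # auxiliary periodontal-disease classifier

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z), self.perio_head(z)

class FusionClassifier(nn.Module):
    """Concatenates image features with a demographic/EHR vector and predicts
    systemic-disease chapters (phase 2); modeled here as multi-label logits."""
    def __init__(self, latent_dim=128, ehr_dim=20, n_chapters=3):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(latent_dim + ehr_dim, 64), nn.ReLU(),
            nn.Linear(64, n_chapters),
        )

    def forward(self, image_feats, ehr_feats):
        return self.mlp(torch.cat([image_feats, ehr_feats], dim=1))

# Phase 1 objective: weighted sum of reconstruction and auxiliary losses (0.5 is an assumed weight).
ae = DualLossAutoencoder()
x = torch.rand(4, 1, 256, 256)                              # dummy radiograph batch scaled to [0, 1]
perio_labels = torch.randint(0, 2, (4,))                    # dummy periodontal-status labels
z, x_hat, perio_logits = ae(x)
loss_phase1 = nn.MSELoss()(x_hat, x) + 0.5 * nn.CrossEntropyLoss()(perio_logits, perio_labels)

# Phase 2: reuse the learned image features and train the fusion classifier on image + EHR data.
fusion = FusionClassifier()
ehr = torch.rand(4, 20)                                     # dummy demographic/clinical feature vector
chapter_labels = torch.randint(0, 2, (4, 3)).float()        # dummy per-chapter labels
loss_phase2 = nn.BCEWithLogitsLoss()(fusion(z.detach(), ehr), chapter_labels)
```

The 0.5 loss weight, the detached (frozen) encoder features in phase 2, and the multi-label output head are all illustrative choices; the abstract does not specify these details.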

Bibliographic Details
Main Authors: Dan Zhao, Morteza Homayounfar, Zhe Zhen, Mei-Zhen Wu, Shuk Yin Yu, Kai-Hang Yiu, Varut Vardhanabhuti, George Pelekos, Lijian Jin, Mohamad Koohi-Moghadam
Format: Article
Language: English
Published: MDPI AG, 2022-12-01
Series: Diagnostics
Subjects: periodontal disease; systemic comorbidity; multimodal deep learning; panoramic radiograph; electronic health records
Online Access:https://www.mdpi.com/2075-4418/12/12/3192
ISSN: 2075-4418
DOI: 10.3390/diagnostics12123192
Citation: Diagnostics 2022, 12(12), 3192

Author Affiliations:
Dan Zhao: Division of Periodontology & Implant Dentistry, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China
Morteza Homayounfar: Division of Applied Oral Sciences & Community Dental Care, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China
Zhe Zhen: Division of Cardiology, Department of Medicine, The University of Hong Kong-Shenzhen Hospital, Shenzhen 518009, China
Mei-Zhen Wu: Division of Cardiology, Department of Medicine, The University of Hong Kong-Shenzhen Hospital, Shenzhen 518009, China
Shuk Yin Yu: Division of Cardiology, Department of Medicine, The University of Hong Kong-Shenzhen Hospital, Shenzhen 518009, China
Kai-Hang Yiu: Division of Cardiology, Department of Medicine, The University of Hong Kong-Shenzhen Hospital, Shenzhen 518009, China
Varut Vardhanabhuti: Department of Diagnostic Radiology, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Hong Kong SAR, China
George Pelekos: Division of Periodontology & Implant Dentistry, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China
Lijian Jin: Division of Periodontology & Implant Dentistry, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China
Mohamad Koohi-Moghadam: Division of Applied Oral Sciences & Community Dental Care, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China