Augmented Radiology: Feature Space Transfer Model for Prostate Cancer Stage Prediction
Recent improvements in medical image analysis using deep learning-based neural networks can potentially be exploited to enhance the performance of computer-aided detection/diagnosis systems. In this study, we propose a feature space transfer model (FSTM) for learning the phenotype relationships between radiological images and pathological images...
Main Authors: | Masahiro Ogino, Zisheng Li, Akinobu Shimizu |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2021-01-01 |
Series: | IEEE Access |
Subjects: | Feature transfer; deep learning; AI-CAD; tumor classification; cancer T-stage prediction |
Online Access: | https://ieeexplore.ieee.org/document/9490249/ |
_version_ | 1818456293163139072 |
---|---|
author | Masahiro Ogino; Zisheng Li; Akinobu Shimizu |
author_facet | Masahiro Ogino; Zisheng Li; Akinobu Shimizu |
author_sort | Masahiro Ogino |
collection | DOAJ |
description | Recent improvements in medical image analysis using deep learning-based neural networks can potentially be exploited to enhance the performance of computer-aided detection/diagnosis systems. In this study, we propose a feature space transfer model (FSTM) for learning the phenotype relationships between radiological images and pathological images. We hypothesize that high-level features from the same patient can be linked between different modality images with different resolutions. We refer to our method as “augmented radiology” because the inference model only requires radiological images as input while the prediction result can be linked to specific pathological phenotypes. We applied the proposed method to the pathological tumor classification (T0 vs. T2c/T3a and T0 vs. T2c vs. T3a) of prostate cancer and found that it achieved a high classification accuracy (0.880 for T0 vs. T2c/T3a and 0.825 for T0 vs. T2c vs. T3a) given only the radiological images as input. We also analyzed the validity of the proposed method by visualizing the transferred features and found that it can extract useful information for diagnosis embedded in radiological images. We conclude that the proposed method will significantly help improve the diagnostic prediction performance of radiological images. |
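The description above hinges on cross-modal feature transfer: high-level features extracted from a patient's radiological images are mapped into the feature space of that patient's pathological images, so that a classifier needing pathology-level information can run on radiology input alone. The paper's FSTM is a deep neural network; purely as a hypothetical illustration of the underlying idea (not the authors' method), a linear map fitted by least squares on paired feature vectors shows what "transferring" one feature space into another means. All names, dimensions, and the synthetic data below are invented for the sketch:

```python
import numpy as np

# Toy sketch: given paired per-patient feature vectors from two modalities,
# fit a linear map W that carries radiology features into the pathology
# feature space by minimising ||x_rad @ W - x_path||^2 over the pairs.
rng = np.random.default_rng(0)

n_pairs, d_rad, d_path = 64, 16, 8
true_map = rng.normal(size=(d_path, d_rad))   # hidden cross-modal link (toy)
x_rad = rng.normal(size=(n_pairs, d_rad))     # "radiological" features
x_path = x_rad @ true_map.T                   # paired "pathological" features

# Closed-form least-squares fit of the transfer map.
W, *_ = np.linalg.lstsq(x_rad, x_path, rcond=None)   # shape (d_rad, d_path)

# At inference time only radiology features are needed: transfer, then
# a downstream classifier would consume x_rad @ W instead of x_path.
loss = float(np.mean((x_rad @ W - x_path) ** 2))
print(f"mean squared transfer error: {loss:.2e}")
```

In this noise-free toy the fitted map recovers the pairing almost exactly; in the paper's setting the modalities differ in resolution and content, which is why a learned nonlinear encoder, rather than a linear map, is used.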
first_indexed | 2024-12-14T22:24:22Z |
format | Article |
id | doaj.art-af60b498b40d455fbce8d3944da18a45 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-12-14T22:24:22Z |
publishDate | 2021-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-af60b498b40d455fbce8d3944da18a45 (2022-12-21T22:45:25Z); eng; IEEE; IEEE Access; ISSN 2169-3536; 2021-01-01; Vol. 9, pp. 102559-102566; DOI 10.1109/ACCESS.2021.3098038; article 9490249. Augmented Radiology: Feature Space Transfer Model for Prostate Cancer Stage Prediction. Masahiro Ogino (https://orcid.org/0000-0002-1812-2431), Research and Development Group, Hitachi, Ltd., Tokyo, Japan; Zisheng Li, Research and Development Group, Hitachi, Ltd., Tokyo, Japan; Akinobu Shimizu, Institute of Engineering, Tokyo University of Agriculture and Technology, Koganei, Tokyo, Japan. Abstract: identical to the description field. https://ieeexplore.ieee.org/document/9490249/ Keywords: Feature transfer; deep learning; AI-CAD; tumor classification; cancer T-stage prediction. |
spellingShingle | Masahiro Ogino; Zisheng Li; Akinobu Shimizu; Augmented Radiology: Feature Space Transfer Model for Prostate Cancer Stage Prediction; IEEE Access; Feature transfer; deep learning; AI-CAD; tumor classification; cancer T-stage prediction |
title | Augmented Radiology: Feature Space Transfer Model for Prostate Cancer Stage Prediction |
title_full | Augmented Radiology: Feature Space Transfer Model for Prostate Cancer Stage Prediction |
title_fullStr | Augmented Radiology: Feature Space Transfer Model for Prostate Cancer Stage Prediction |
title_full_unstemmed | Augmented Radiology: Feature Space Transfer Model for Prostate Cancer Stage Prediction |
title_short | Augmented Radiology: Feature Space Transfer Model for Prostate Cancer Stage Prediction |
title_sort | augmented radiology feature space transfer model for prostate cancer stage prediction |
topic | Feature transfer; deep learning; AI-CAD; tumor classification; cancer T-stage prediction |
url | https://ieeexplore.ieee.org/document/9490249/ |
work_keys_str_mv | AT masahiroogino augmentedradiologyfeaturespacetransfermodelforprostatecancerstageprediction AT zishengli augmentedradiologyfeaturespacetransfermodelforprostatecancerstageprediction AT akinobushimizu augmentedradiologyfeaturespacetransfermodelforprostatecancerstageprediction |