Automated Detection of Gastric Cancer by Retrospective Endoscopic Image Dataset Using U-Net R-CNN

Upper gastrointestinal endoscopy is widely performed to detect early gastric cancers. Automated detection of early gastric cancer from endoscopic images has previously been attempted with deep-learning-based object detection models; however, reducing false positives in the detected results remained a challenge. In this study, we proposed a novel object detection model, U-Net R-CNN, based on a semantic segmentation technique that extracts target objects by performing a local analysis of the images. U-Net was introduced as the semantic segmentation method to detect candidates for early gastric cancer, and these candidates were then classified as gastric cancer cases or false positives by box classification using a convolutional neural network. In the experiments, the detection performance was evaluated via 5-fold cross-validation using 1208 images of healthy subjects and 533 images of gastric cancer patients. When DenseNet169 was used as the convolutional neural network for box classification, the detection sensitivity and the number of false positives evaluated on a lesion basis were 98% and 0.01 per image, respectively, an improvement over the previous method. These results indicate that the proposed method will be useful for the automated detection of early gastric cancer from endoscopic images.

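The abstract describes a two-stage pipeline: a U-Net segments candidate gastric-cancer regions, and a convolutional neural network (DenseNet169) then classifies each candidate box to reject false positives. The sketch below shows how such a pipeline could be assembled. It is a minimal illustration assuming Keras-style models; the function names, thresholds, 224x224 box size, and omitted image preprocessing are assumptions for illustration, not details taken from the paper.

# Illustrative two-stage detection pipeline in the spirit of U-Net R-CNN.
# Trained U-Net and DenseNet169 models are assumed to exist; weights,
# thresholds, and the box size are placeholders, not values from the paper.
import numpy as np
import tensorflow as tf
from scipy import ndimage


def build_box_classifier(box_size: int = 224) -> tf.keras.Model:
    """DenseNet169 backbone with a binary head for box classification (sketch)."""
    backbone = tf.keras.applications.DenseNet169(
        weights=None,  # pretrained/fine-tuned weights would be loaded in practice
        include_top=False, pooling="avg", input_shape=(box_size, box_size, 3))
    output = tf.keras.layers.Dense(1, activation="sigmoid")(backbone.output)
    return tf.keras.Model(backbone.input, output)


def detect_candidates(unet: tf.keras.Model, image: np.ndarray,
                      mask_threshold: float = 0.5):
    """Segment one endoscopic image with U-Net and return candidate boxes.

    The U-Net is assumed to output an HxWx1 probability map; connected
    components of the thresholded map become candidate bounding boxes.
    """
    prob_map = unet.predict(image[np.newaxis, ...], verbose=0)[0, ..., 0]
    labels, _ = ndimage.label(prob_map > mask_threshold)
    boxes = []
    for sl in ndimage.find_objects(labels):
        if sl is not None:
            y, x = sl
            boxes.append((y.start, y.stop, x.start, x.stop))
    return boxes


def classify_boxes(classifier: tf.keras.Model, image: np.ndarray, boxes,
                   box_size: int = 224, score_threshold: float = 0.5):
    """Crop each candidate, resize it, and keep only boxes the CNN accepts."""
    detections = []
    for (y0, y1, x0, x1) in boxes:
        crop = tf.image.resize(image[y0:y1, x0:x1], (box_size, box_size))
        score = float(classifier.predict(crop[np.newaxis, ...], verbose=0)[0, 0])
        if score >= score_threshold:  # accepted as gastric cancer, not a false positive
            detections.append(((y0, y1, x0, x1), score))
    return detections

Separating segmentation-based candidate detection from per-box classification is what drives the false-positive reduction reported in the abstract: each candidate region is re-examined in isolation at a fixed input size before it is reported as a lesion.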

Bibliographic Details
Main Authors: Atsushi Teramoto, Tomoyuki Shibata, Hyuga Yamada, Yoshiki Hirooka, Kuniaki Saito, Hiroshi Fujita
Format: Article
Language: English
Published: MDPI AG, 2021-11-01
Series: Applied Sciences
ISSN: 2076-3417
DOI: 10.3390/app112311275
Subjects: gastric cancer; endoscopy; deep learning; convolutional neural network
Online Access: https://www.mdpi.com/2076-3417/11/23/11275
Author Affiliations:
Atsushi Teramoto, Kuniaki Saito: School of Medical Sciences, Fujita Health University, Toyoake 470-1192, Japan
Tomoyuki Shibata, Hyuga Yamada, Yoshiki Hirooka: Department of Gastroenterology and Hepatology, Fujita Health University, Toyoake 470-1192, Japan
Hiroshi Fujita: Faculty of Engineering, Gifu University, Gifu 501-1194, Japan