Residual learning for segmentation of the medical images in healthcare


Bibliographic Details
Main Authors: Jyotirmaya Sahoo, Shiv Kumar Saini, Shweta Singh, Ashendra Kumar Saxena, Sachin Sharma, Aishwary Awasthi, R. Rajalakshmi
Format: Article
Language: English
Published: Elsevier 2024-04-01
Series: Measurement: Sensors
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2665917423003343
Description
Summary: Medical workers can assess disease progression and create expedient treatment plans with the help of automated, accurate 3D segmentation of medical images. Deep convolutional neural networks (DCNNs) have been widely applied to this task, but their accuracy still needs improvement, largely because of their limited understanding of 3D context. This study proposes a three-dimensional residual network, ResUNet++, for precise segmentation of 3D medical images, built from an encoder, a segmentation decoder, and a context residual decoder. The two decoders are connected at each scale through context residuals and context attention maps: the former explicitly learn inter-slice context, and the latter use that context as attention to increase segmentation accuracy. The model was assessed on the MICCAI 2018 BraTS dataset and the Pancreas-CT dataset. On BraTS, results were compared in terms of enhancing tumor (ET), whole tumor (WT), and tumor core (TC), and the proposed model was further compared with and without boundary loss using the validation Dice score. The outcomes not only show the effectiveness of the suggested 3D residual learning approach but also show that the proposed ResUNet++ offers better accuracy than six top-ranking brain tumor segmentation techniques.
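The validation Dice score mentioned in the abstract is the standard overlap metric for evaluating segmentation masks. The sketch below (NumPy, illustrative only, not the authors' code) shows how it is typically computed for a pair of binary 3D volumes:

```python
import numpy as np

def dice_score(pred, target, eps=1e-7):
    """Dice coefficient between two binary masks of any dimensionality (incl. 3D volumes).

    Dice = 2|A ∩ B| / (|A| + |B|); eps guards against division by zero for empty masks.
    """
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Two overlapping binary 3D volumes (hypothetical toy example)
a = np.zeros((4, 4, 4), dtype=bool)
a[1:3, 1:3, 1:3] = True   # 8 voxels
b = np.zeros((4, 4, 4), dtype=bool)
b[1:3, 1:3, 1:4] = True   # 12 voxels, 8 of which overlap with `a`
print(round(dice_score(a, b), 2))  # 2*8 / (8 + 12) = 0.8
```

In the BraTS setting, such a score would be computed separately for each region (ET, WT, TC) after binarizing the predicted label map for that region.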
ISSN: 2665-9174