MS-CANet: Multi-Scale Subtraction Network with Coordinate Attention for Retinal Vessel Segmentation


Bibliographic Details
Main Authors: Yun Jiang, Wei Yan, Jie Chen, Hao Qiao, Zequn Zhang, Meiqi Wang
Format: Article
Language: English
Published: MDPI AG 2023-03-01
Series: Symmetry
Online Access: https://www.mdpi.com/2073-8994/15/4/835
Description
Summary: Retinal vessel segmentation is crucial in the diagnosis of certain ophthalmic and cardiovascular diseases. Although U-shaped networks have been widely used for retinal vessel segmentation, most of the improved methods have insufficient feature-extraction capability and fuse different network layers by simple element-wise or dimension-wise summation, leading to redundant information and inaccurate retinal vessel localization with blurred vessel edges. The asymmetry of small blood vessels in fundus images further increases the difficulty of segmentation. To overcome these challenges, we propose MS-CANet, a novel multi-scale subtraction network with residual coordinate attention for segmenting vessels in retinal fundus images. Our approach incorporates a residual coordinate attention module during the encoding phase, which captures long-range spatial dependencies while preserving precise position information. To obtain rich multi-scale information, we also include multi-scale subtraction units at different receptive-field levels. Moreover, we introduce a parallel channel attention module that enhances the contrast between vessels and the background, thereby improving the detection of marginal vessels during the decoding phase. We validate the proposed model on three benchmark datasets, namely DRIVE, CHASE, and STARE. The results demonstrate that our method outperforms most state-of-the-art methods under different evaluation metrics.
ISSN:2073-8994
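
The record gives no implementation details, but the two encoder-side ideas named in the summary, coordinate attention with a residual connection and multi-scale subtraction units, can be illustrated in PyTorch. The sketch below is a minimal, hypothetical rendering and not the authors' published code: module names, channel sizes, and the exact placement of normalization and activations are assumptions made for illustration only.

```python
import torch
import torch.nn as nn


class CoordinateAttention(nn.Module):
    """Coordinate attention: factorizes global pooling into two 1-D pools
    along height and width, so the attention weights keep precise positional
    information along each spatial axis."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # pool over width  -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # pool over height -> (B, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.shape
        x_h = self.pool_h(x)                       # (B, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)   # (B, C, W, 1)
        y = torch.cat([x_h, x_w], dim=2)           # (B, C, H+W, 1)
        y = self.act(self.bn1(self.conv1(y)))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (B, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (B, C, 1, W)
        return x * a_h * a_w


class ResidualCoordAttentionBlock(nn.Module):
    """Hypothetical encoder block: two 3x3 convolutions followed by
    coordinate attention, with a residual (identity) shortcut."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch),
            CoordinateAttention(out_ch),
        )
        self.skip = nn.Conv2d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.body(x) + self.skip(x))


class SubtractionUnit(nn.Module):
    """Hypothetical multi-scale subtraction unit: fuses two feature maps by
    their absolute difference rather than addition or concatenation, which is
    the abstract's stated remedy for redundant information."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
        )

    def forward(self, fa, fb):
        # fb is assumed to be resized/projected to fa's shape beforehand
        return self.conv(torch.abs(fa - fb))
```

The sketch covers only the encoder-side components named in the summary; the parallel channel attention module used in the decoding phase, and how the subtraction units are arranged across receptive-field levels, would have to be taken from the full paper at the link above.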