Gated Skip-Connection Network with Adaptive Upsampling for Retinal Vessel Segmentation


Bibliographic Details
Main Authors: Yun Jiang, Huixia Yao, Shengxin Tao, Jing Liang
Format: Article
Language: English
Published: MDPI AG 2021-09-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/21/18/6177
Description
Summary: Segmentation of retinal vessels is a critical step in the diagnosis of some fundus diseases. Methods: To further enhance the performance of vessel segmentation, we propose a method based on a gated skip-connection network with adaptive upsampling (GSAU-Net). In GSAU-Net, a novel gated skip-connection is first utilized in the expansion path, which facilitates the flow of information from the encoder to the decoder. Specifically, the gated skip-connection between the encoder and decoder gates the lower-level information coming from the encoder. In the decoding phase, adaptive upsampling replaces bilinear interpolation, recovering feature maps from the decoder to obtain the pixelwise prediction. Finally, we validated our method on the DRIVE, CHASE, and STARE datasets. Results: The experimental results showed that the proposed method outperformed existing methods such as DeepVessel, AG-Net, and IterNet in terms of accuracy, F-measure, and AUC-ROC. The proposed method achieved a vessel segmentation F-measure of 83.13%, 81.40%, and 84.84% on the DRIVE, CHASE, and STARE datasets, respectively.
ISSN:1424-8220
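
The abstract describes two building blocks: a gated skip-connection that filters low-level encoder features before they reach the decoder, and a learned ("adaptive") upsampling step that replaces bilinear interpolation. The PyTorch sketch below illustrates one plausible reading of these ideas under stated assumptions; the class names, channel sizes, the additive-attention gate formulation, and the choice of sub-pixel convolution for the adaptive upsampling are illustrative assumptions, not the authors' published GSAU-Net implementation.

```python
# Minimal sketch of a gated skip-connection and a learned upsampling block.
# All module names, channel sizes, and the gate formulation are assumptions
# for illustration; they do not reproduce the authors' GSAU-Net code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedSkipConnection(nn.Module):
    """Gate low-level encoder features using a signal derived from the decoder."""

    def __init__(self, enc_channels: int, dec_channels: int):
        super().__init__()
        # 1x1 convolutions project both streams to a common feature space (assumed design).
        self.enc_proj = nn.Conv2d(enc_channels, enc_channels, kernel_size=1)
        self.dec_proj = nn.Conv2d(dec_channels, enc_channels, kernel_size=1)
        self.gate = nn.Conv2d(enc_channels, 1, kernel_size=1)

    def forward(self, enc_feat: torch.Tensor, dec_feat: torch.Tensor) -> torch.Tensor:
        # Resize the coarser decoder feature to the encoder's spatial size before gating.
        dec_feat = F.interpolate(dec_feat, size=enc_feat.shape[2:],
                                 mode="bilinear", align_corners=False)
        attn = torch.sigmoid(
            self.gate(F.relu(self.enc_proj(enc_feat) + self.dec_proj(dec_feat)))
        )
        # Suppress irrelevant low-level responses before they are passed to the decoder.
        return enc_feat * attn


class AdaptiveUpsample(nn.Module):
    """Learned upsampling (sub-pixel convolution) used in place of bilinear interpolation."""

    def __init__(self, in_channels: int, out_channels: int, scale: int = 2):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels * scale * scale,
                              kernel_size=3, padding=1)
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The convolution learns the upsampling weights from the data itself.
        return self.shuffle(self.conv(x))


if __name__ == "__main__":
    enc = torch.randn(1, 64, 64, 64)   # low-level encoder feature map
    dec = torch.randn(1, 128, 32, 32)  # coarser decoder feature map
    gated = GatedSkipConnection(64, 128)(enc, dec)
    up = AdaptiveUpsample(128, 64)(dec)
    print(gated.shape, up.shape)       # (1, 64, 64, 64) and (1, 64, 64, 64)
```

The gated output and the upsampled decoder feature share the same resolution, so they can be concatenated or summed before the next decoder stage, mirroring the encoder-to-decoder information flow the abstract attributes to the gated skip-connections.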