BSDNet: Balanced Sample Distribution Network for Real-Time Semantic Segmentation of Road Scenes

Bibliographic Details
Main Authors: Lv Ye, Jianxu Zeng, Yue Yang, Ashara Emmanuel Chimaobi, Nyaradzo Mercy Sekenya
Format: Article
Language: English
Published: IEEE 2021-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9448235/
Description
Summary: In recent years, semantic segmentation based on deep convolutional neural networks has developed rapidly. However, balancing computing cost and segmentation performance remains a challenge for current semantic segmentation methods. This paper proposes a lightweight real-time semantic segmentation model, the Balanced Sample Distribution Network (BSDNet), to address this problem. In BSDNet, we introduce the Balanced Sample Distribution Module (BSDModule) to balance the sampling distribution of convolution and obtain features with a larger receptive field. To further improve segmentation quality, we introduce a Shuffle Channel Attention Module (SCAModule) that enhances the interaction of channel features at the cost of only a small number of parameters. Both BSDModule and SCAModule are lightweight and flexible and can be adapted to different types of network structures. Extensive experiments on CityScapes and CamVid show that the proposed method balances computing cost and segmentation performance. Specifically, on CityScapes at 512×1024 resolution, BSDNet-Xception39 achieves 68.3% MIoU and 84.6 FPS with only 1.2M parameters.
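The record reproduces only the abstract, so the exact design of the modules is not given here. As a rough, hypothetical PyTorch-style sketch of the general idea behind a "shuffle channel attention" block of the kind the abstract describes (a channel shuffle to mix information across channel groups, followed by a lightweight channel attention gate), one might write something like the following; the class name, the groups and reduction parameters, and the squeeze-and-excitation-style gate are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch only: the paper's SCAModule is not published in this record.
# It illustrates two standard, low-cost ingredients the abstract hints at:
#   1) channel shuffle, so information is exchanged across channel groups, and
#   2) a cheap per-channel attention gate (squeeze-and-excitation style).
import torch
import torch.nn as nn


def channel_shuffle(x: torch.Tensor, groups: int) -> torch.Tensor:
    """Interleave channels across `groups` (as in ShuffleNet)."""
    n, c, h, w = x.size()
    x = x.view(n, groups, c // groups, h, w)
    x = x.transpose(1, 2).contiguous()
    return x.view(n, c, h, w)


class SCAModuleSketch(nn.Module):
    """Assumed structure: channel shuffle followed by a lightweight channel gate."""

    def __init__(self, channels: int, groups: int = 4, reduction: int = 8):
        super().__init__()
        self.groups = groups
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global context per channel
        self.gate = nn.Sequential(                   # excitation: tiny bottleneck, few parameters
            nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = channel_shuffle(x, self.groups)          # mix information across channel groups
        w = self.gate(self.pool(x))                  # per-channel attention weights in (0, 1)
        return x * w                                 # reweight channels


if __name__ == "__main__":
    feat = torch.randn(1, 64, 64, 128)               # N, C, H, W feature map
    out = SCAModuleSketch(64)(feat)
    print(out.shape)                                 # torch.Size([1, 64, 64, 128])
```

Under these assumptions the extra parameter count is only that of the two 1x1 convolutions in the gate (about 2·C²/reduction weights), which is consistent with the abstract's claim of enhancing channel interaction at a small parameter cost.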
ISSN:2169-3536