CAP: Context-aware Pruning for semantic segmentation

Bibliographic Details
Main Authors: He, Wei; Wu, Meiqing; Liang, Mingfu; Lam, Siew-Kei
Other Authors: School of Computer Science and Engineering
Format: Conference Paper
Language: English
Published: 2021
Subjects: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence; Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision; Network Compression; Semantic Segmentation
Online Access:https://hdl.handle.net/10356/147439
collection NTU
description Network pruning for deep convolutional neural networks (CNNs) has recently achieved notable research progress on image-level classification. However, most existing pruning methods are not catered to or evaluated on semantic segmentation networks. In this paper, we advocate the importance of contextual information during channel pruning for semantic segmentation networks by presenting a novel Context-aware Pruning framework. Concretely, we formulate the embedded contextual information by leveraging the layer-wise channel interdependency via the Context-aware Guiding Module (CAGM), and introduce Context-aware Guided Sparsification (CAGS) to adaptively identify the informative channels in the cumbersome model by inducing channel-wise sparsity on the scaling factors of batch normalization (BN) layers. The resulting pruned models require significantly fewer operations for inference while maintaining performance comparable to (and at times exceeding) that of the original models. We evaluated our framework on widely used benchmarks and showed its effectiveness on both large and lightweight models. On the Cityscapes dataset, our framework reduces the number of parameters by 32%, 47%, 54%, and 63% on PSPNet101, PSPNet50, ICNet, and SegNet, respectively, while preserving performance.
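The key mechanism in the abstract, inducing channel-wise sparsity on BN scaling factors and then keeping only the informative channels, can be sketched generically. The snippet below is an illustration, not the authors' CAP code: it omits the context-aware guidance of CAGM/CAGS and shows only the common BN-gamma criterion (an L1 penalty during training, then a global magnitude threshold at a target prune ratio). All names and numbers are illustrative.

```python
# Generic sketch of BN-scaling-factor channel pruning (not the CAP method
# itself): an L1 penalty pushes BN gammas toward zero during training, and
# a global magnitude threshold then selects which channels to keep.

def l1_sparsity_penalty(gammas, strength=1e-4):
    """L1 regularizer added to the training loss so that uninformative
    channels end up with near-zero BN scaling factors."""
    return strength * sum(abs(g) for layer in gammas for g in layer)

def select_channels(gammas, prune_ratio):
    """Given per-layer BN scaling factors, compute one global threshold
    so that roughly `prune_ratio` of all channels fall below it, and
    return per-layer boolean keep-masks."""
    flat = sorted(abs(g) for layer in gammas for g in layer)
    threshold = flat[int(prune_ratio * len(flat))]
    return [[abs(g) >= threshold for g in layer] for layer in gammas]

# Example: two layers of hypothetical trained gammas; prune about half.
gammas = [[0.9, 0.01, 0.5, 0.02], [0.03, 0.8, 0.04, 0.7]]
masks = select_channels(gammas, prune_ratio=0.5)
# Channels with small |gamma| (0.01, 0.02, 0.03, 0.04) are marked False.
```

In the paper's framework, this plain magnitude criterion is replaced by context-aware guidance, so the masks would additionally reflect layer-wise channel interdependencies rather than gamma magnitude alone.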
id ntu-10356/147439
institution Nanyang Technological University
Conference: 2021 IEEE Winter Conference on Applications of Computer Vision (WACV)
Research Unit: Hardware & Embedded Systems Lab (HESL)
Funding: National Research Foundation (NRF). This research project is supported in part by the National Research Foundation Singapore under its Campus for Research Excellence and Technological Enterprise (CREATE) programme with the Technical University of Munich at TUMCREATE.
Citation: He, W., Wu, M., Liang, M. & Lam, S. (2021). CAP: Context-aware Pruning for semantic segmentation. 2021 IEEE Winter Conference on Applications of Computer Vision (WACV), 960-969. https://hdl.handle.net/10356/147439
Rights: © 2021 The Author(s) (published by IEEE). This is an open-access article distributed under the terms of the Creative Commons Attribution License (published version).
topic Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Network Compression
Semantic Segmentation