Providing clear pruning threshold: A novel CNN pruning method via L0 regularisation
Abstract: Network pruning is a significant way to improve the practicability of convolutional neural networks (CNNs) by removing redundant structure from the network model. However, most existing network pruning methods apply l1 or l2 regularisation to the parameter matrices, and the manual selection of a pruning threshold is difficult and labor-intensive...
Main Authors: | Guo Li, Gang Xu |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley, 2021-02-01 |
Series: | IET Image Processing |
Subjects: | Optimisation techniques; Interpolation and function approximation (numerical analysis); Algebra; Neural nets |
Online Access: | https://doi.org/10.1049/ipr2.12030 |
---|---|
author | Guo Li; Gang Xu |
collection | DOAJ |
description | Abstract: Network pruning is a significant way to improve the practicability of convolutional neural networks (CNNs) by removing redundant structure from the network model. However, most existing network pruning methods apply l1 or l2 regularisation to the parameter matrices, and the manual selection of a pruning threshold is difficult and labor-intensive. A novel CNN pruning method based on l0 regularisation is proposed, which uses the l0 penalty to widen the saliency gap between neurons. A half-quadratic splitting (HQS) based iterative algorithm is put forward to compute an approximate solution of the l0-regularised problem, so that the joint optimisation of the regularisation term and the training loss can be handled by standard gradient-based algorithms. Meanwhile, a hyperparameter selection method is designed so that most of the hyperparameters in the algorithm can be determined by examining the pre-trained model. Experiments on MNIST, Fashion-MNIST and CIFAR100 show that the proposed method provides a much clearer pruning threshold by widening the saliency gap, and achieves similar or even better compression performance compared with state-of-the-art methods. |
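The abstract describes splitting the l0-regularised training objective with an auxiliary variable so that one sub-step has a closed-form hard-thresholding solution and the other is an ordinary gradient step. The sketch below is a minimal illustration of that half-quadratic splitting idea, not the authors' implementation: the toy convolution layer, the loss, and the values of `lam` (l0 weight) and `beta` (HQS penalty) are assumptions for demonstration only.

```python
import torch
import torch.nn as nn

# Toy convolution layer whose weights we regularise (illustrative only).
layer = nn.Conv2d(1, 8, kernel_size=3)
opt = torch.optim.SGD(layer.parameters(), lr=0.01)

lam, beta = 1e-4, 1.0  # l0 weight and HQS penalty strength (assumed values)

# Auxiliary variable z introduced by half-quadratic splitting:
# minimise  loss(w) + lam*||z||_0 + (beta/2)*||w - z||^2  over w and z alternately.
z = layer.weight.detach().clone()

def hqs_step(x, target):
    global z
    # 1) z-update: argmin_z  lam*||z||_0 + (beta/2)*||w - z||^2
    #    has the closed-form hard-thresholding solution below.
    w = layer.weight.detach()
    keep = w.pow(2) > 2.0 * lam / beta
    z = torch.where(keep, w, torch.zeros_like(w))

    # 2) w-update: gradient step on loss(w) + (beta/2)*||w - z||^2,
    #    which is smooth, so any gradient-based optimiser applies.
    opt.zero_grad()
    out = layer(x)
    loss = nn.functional.mse_loss(out, target) \
        + 0.5 * beta * (layer.weight - z).pow(2).sum()
    loss.backward()
    opt.step()
    return loss.item()

# Dummy data just to show the call pattern.
x = torch.randn(4, 1, 8, 8)
target = torch.randn(4, 8, 6, 6)
for _ in range(10):
    hqs_step(x, target)
```

In the spirit of the abstract, structures whose auxiliary variable stays at zero after training fall clearly below the widened saliency gap and can be pruned; the paper's scheme for choosing the hyperparameters from the pre-trained model is not reproduced in this sketch.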
format | Article |
id | doaj.art-2295285fb0fd4e2da203262f43ebc5b7 |
institution | Directory Open Access Journal |
issn | 1751-9659 1751-9667 |
language | English |
publishDate | 2021-02-01 |
publisher | Wiley |
series | IET Image Processing |
citation | IET Image Processing, vol. 15, no. 2, pp. 405-418, 2021-02-01. DOI: 10.1049/ipr2.12030 |
affiliation | Guo Li; Gang Xu: North China Electric Power University, Beijing 102206, China |
title | Providing clear pruning threshold: A novel CNN pruning method via L0 regularisation |
topic | Optimisation techniques Interpolation and function approximation (numerical analysis) Algebra Neural nets |
url | https://doi.org/10.1049/ipr2.12030 |