A signal propagation perspective for pruning neural networks at initialization

Bibliographic Details
Main Authors: Lee, N, Ajanthan, T, Gould, S, Torr, PHS
Format: Conference item
Language: English
Published: International Conference on Learning Representations, 2019
Collection: OXFORD
ID: oxford-uuid:b3bf2162-b58b-4c32-bfa5-66bbe9b17563
Institution: University of Oxford

Description: Network pruning is a promising avenue for compressing deep neural networks. A typical approach to pruning starts by training a model and then removing redundant parameters while minimizing the impact on what is learned. Alternatively, a recent approach shows that pruning can be done at initialization, prior to training, based on a saliency criterion called connection sensitivity. However, it remains unclear exactly why pruning an untrained, randomly initialized neural network is effective. In this work, noting that connection sensitivity is a form of gradient, we formally characterize initialization conditions that ensure reliable connection sensitivity measurements, which in turn yield effective pruning results. Moreover, we analyze the signal propagation properties of the resulting pruned networks and introduce a simple, data-free method to improve their trainability. Our modifications to the existing pruning-at-initialization method lead to improved results on all tested network models for image classification tasks. Furthermore, we empirically study the effect of supervision for pruning and demonstrate that our signal propagation perspective, combined with unsupervised pruning, can be useful in various scenarios where pruning is applied to non-standard, arbitrarily designed architectures.
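
To make the saliency criterion concrete: connection sensitivity scores each weight by the magnitude of the gradient of the loss with respect to an auxiliary all-ones connectivity mask, normalized over all connections, and keeps the highest-scoring fraction. Below is a minimal PyTorch sketch of this measurement at initialization; the function name `connection_sensitivity_masks`, the cross-entropy loss, the `keep_ratio` value, and the toy model are illustrative assumptions, not the authors' reference implementation.

    # Minimal sketch (assumed implementation, not the authors' code):
    # connection-sensitivity pruning at initialization. For effective
    # weights c_j * w_j with the mask c fixed at 1, dL/dc_j = g_j * w_j,
    # so the saliency of connection j is s_j = |g_j*w_j| / sum_k |g_k*w_k|.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def connection_sensitivity_masks(model, inputs, targets, keep_ratio=0.1):
        """Return binary masks keeping the top `keep_ratio` fraction of
        connections, scored on a single mini-batch at initialization."""
        weights = [p for p in model.parameters() if p.dim() > 1]  # weight tensors only
        loss = F.cross_entropy(model(inputs), targets)
        grads = torch.autograd.grad(loss, weights)
        saliency = [(g * w).abs() for g, w in zip(grads, weights)]
        total = sum(s.sum() for s in saliency)  # normalizer over all connections
        flat = torch.cat([s.flatten() for s in saliency]) / total
        k = max(1, int(keep_ratio * flat.numel()))
        threshold = torch.topk(flat, k).values[-1]  # k-th largest normalized saliency
        return [((s / total) >= threshold).float() for s in saliency]

    # Usage sketch on a toy MLP, with random data standing in for a real
    # mini-batch; the masks would then be applied to the weights before training.
    model = nn.Sequential(nn.Flatten(), nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
    x, y = torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,))
    masks = connection_sensitivity_masks(model, x, y, keep_ratio=0.1)

The abstract's data-free fix for trainability, adjusting the initialization so that signals propagate faithfully before sensitivity is measured, is not shown in this sketch.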