A signal propagation perspective for pruning neural networks at initialization
Network pruning is a promising avenue for compressing deep neural networks. A typical approach to pruning starts by training a model and then removing redundant parameters while minimizing the impact on what is learned. Alternatively, a recent approach shows that pruning can be done at initialization...
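The abstract refers to pruning at initialization, i.e. selecting which connections to remove before any training. As a rough illustration, here is a minimal sketch of a saliency-based criterion in the spirit of SNIP (Lee et al., 2019, listed under Similar Items): score each weight by |gradient × weight| on one batch, then keep the highest-scoring fraction. All names, shapes, and the toy data below are illustrative assumptions, not the authors' released code or this paper's signal-propagation analysis.

```python
# Sketch: SNIP-style connection-sensitivity pruning at initialization.
# Assumes PyTorch; model, shapes, and data are hypothetical.
import torch
import torch.nn as nn

def snip_style_masks(model, loss_fn, x, y, sparsity):
    """Score each weight by |dL/dw * w| on one batch; keep the top (1 - sparsity)."""
    weights = [p for p in model.parameters() if p.dim() > 1]
    loss = loss_fn(model(x), y)
    grads = torch.autograd.grad(loss, weights)
    # Connection sensitivity: magnitude of gradient times weight, per parameter.
    scores = [(g * w).abs() for g, w in zip(grads, weights)]
    flat = torch.cat([s.flatten() for s in scores])
    k = int((1.0 - sparsity) * flat.numel())
    threshold = torch.topk(flat, k, largest=True).values.min()
    return [(s >= threshold).float() for s in scores]

# Toy usage on random data (hypothetical sizes).
torch.manual_seed(0)
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 10))
x, y = torch.randn(32, 20), torch.randint(0, 10, (32,))
masks = snip_style_masks(model, nn.CrossEntropyLoss(), x, y, sparsity=0.9)
# Apply the masks once at initialization; training then proceeds on the sparse net.
with torch.no_grad():
    for p, m in zip([p for p in model.parameters() if p.dim() > 1], masks):
        p.mul_(m)
```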
Main Authors: Lee, N; Ajanthan, T; Gould, S; Torr, PHS
Format: Conference item
Language: English
Published: International Conference on Learning Representations, 2019
Similar Items
- SNIP: single-shot network pruning based on connection sensitivity
  by: Lee, N, et al.
  Published: (2019)
- Data parallelism in training sparse neural networks
  by: Lee, N, et al.
  Published: (2020)
- Understanding the effects of data parallelism and sparsity on neural network training
  by: Lee, N, et al.
  Published: (2020)
- Mirror Descent view for Neural Network quantization
  by: Ajanthan, T, et al.
  Published: (2021)
- Proximal mean-field for neural network quantization
  by: Ajanthan, T, et al.
  Published: (2020)