Stable rank normalization for improved generalization in neural networks and GANs
Exciting new work on generalization bounds for neural networks (NN) given by Bartlett et al. (2017); Neyshabur et al. (2018) depends closely on two parameter-dependent quantities: the Lipschitz constant upper bound and the stable rank (a softer version of rank). Even though these bounds typically ha...
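As an illustration of the stable rank mentioned in the abstract (not part of this record), below is a minimal Python sketch assuming the standard definition srank(W) = ||W||_F^2 / ||W||_2^2, which is always at most rank(W) and is insensitive to near-zero singular values:

```python
import numpy as np

def stable_rank(W: np.ndarray) -> float:
    """Stable rank: squared Frobenius norm over squared spectral norm."""
    fro_sq = np.sum(W ** 2)                  # ||W||_F^2 = sum of squared singular values
    spec_sq = np.linalg.norm(W, ord=2) ** 2  # ||W||_2^2 = largest squared singular value
    return float(fro_sq / spec_sq)

# Example: full rank, yet only 5 dominant directions, so stable rank is ~5.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((100, 100)))
s = np.array([1.0] * 5 + [1e-3] * 95)        # 5 large singular values, 95 tiny ones
W = U @ np.diag(s) @ U.T
print(np.linalg.matrix_rank(W))              # 100
print(round(stable_rank(W), 3))              # ~5.0
```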
Main Authors: Sanyal, A; Torr, P; Dokania, P
Format: Conference item
Language: English
Published: International Conference on Learning Representations, 2020
Similar Items
- On using focal loss for neural network calibration
  by: Mukhoti, J, et al.
  Published: (2020)
- Calibrating deep neural networks using focal loss
  by: Mukhoti, J, et al.
  Published: (2020)
- On Generalization Bounds for Neural Networks with Low Rank Layers
  by: Pinto, Andrea, et al.
  Published: (2024)
- Proximal mean-field for neural network quantization
  by: Ajanthan, T, et al.
  Published: (2020)
- Are vision transformers always more robust than convolutional neural networks?
  by: Pinto, F, et al.
  Published: (2021)