Stable rank normalization for improved generalization in neural networks and GANs
Exciting new work on generalization bounds for neural networks (NNs) by Bartlett et al. (2017) and Neyshabur et al. (2018) depends closely on two parameter-dependent quantities: the Lipschitz constant upper bound and the stable rank (a softer version of rank). Even though these bounds typically ha...
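For context, the stable rank mentioned in the abstract has a standard definition: the squared Frobenius norm of a weight matrix divided by its squared spectral norm. A minimal NumPy sketch (not the paper's implementation) of this quantity:

```python
import numpy as np

def stable_rank(W: np.ndarray) -> float:
    """Stable rank: ||W||_F^2 / ||W||_2^2, a soft, perturbation-robust
    surrogate for the matrix rank. Always <= rank(W)."""
    fro_sq = float(np.sum(W ** 2))          # squared Frobenius norm
    spec = np.linalg.norm(W, ord=2)          # largest singular value
    return fro_sq / spec ** 2

# The identity has stable rank equal to its rank,
# while a rank-1 matrix has stable rank 1.
print(stable_rank(np.eye(4)))                 # 4.0
print(stable_rank(np.outer(np.ones(3), np.ones(3))))  # 1.0
```

Because the spectral norm dominates every singular value, the stable rank never exceeds the true rank, which is why it appears as a softer quantity in the generalization bounds the abstract cites.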
Main authors: Sanyal, A; Torr, P; Dokania, P
Format: Conference item
Language: English
Published/Created: International Conference on Learning Representations, 2020
Similar items
- On using focal loss for neural network calibration
  by: Mukhoti, J, et al.
  Published: (2020)
- Calibrating deep neural networks using focal loss
  by: Mukhoti, J, et al.
  Published: (2020)
- On Generalization Bounds for Neural Networks with Low Rank Layers
  by: Pinto, Andrea, et al.
  Published: (2024)
- Proximal mean-field for neural network quantization
  by: Ajanthan, T, et al.
  Published: (2020)
- Are vision transformers always more robust than convolutional neural networks?
  by: Pinto, F, et al.
  Published: (2021)