Feature learning in deep classifiers through Intermediate Neural Collapse
In this paper, we conduct an empirical study of the feature learning process in deep classifiers. Recent research has identified a training phenomenon called Neural Collapse (NC), in which the top-layer feature embeddings of samples from the same class tend to concentrate around their means, and the...
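The abstract's notion of same-class embeddings concentrating around their class means is commonly quantified by the NC1 within-class variability metric of Papyan, Han, and Donoho, Tr(Σ_W Σ_B⁺)/C. Below is a minimal NumPy sketch of that metric, not code from the paper; the function name and the balanced-class averaging are assumptions, and the same measurement can in principle be applied to features extracted from any intermediate layer:

```python
# Minimal sketch (not the authors' code) of the standard NC1 metric:
# within-class covariance measured against between-class covariance,
# Tr(Sigma_W @ pinv(Sigma_B)) / C. Smaller values indicate that each
# class's embeddings have collapsed toward their class mean.
import numpy as np

def nc1_variability(features: np.ndarray, labels: np.ndarray) -> float:
    """features: (N, d) embeddings from some layer; labels: (N,) ints."""
    classes = np.unique(labels)
    global_mean = features.mean(axis=0)
    d = features.shape[1]
    sigma_w = np.zeros((d, d))  # average per-class covariance
    sigma_b = np.zeros((d, d))  # covariance of class means
    for c in classes:
        fc = features[labels == c]
        mu_c = fc.mean(axis=0)
        centered = fc - mu_c
        sigma_w += centered.T @ centered / len(fc)
        diff = (mu_c - global_mean)[:, None]
        sigma_b += diff @ diff.T
    sigma_w /= len(classes)  # equals the usual Sigma_W for balanced classes
    sigma_b /= len(classes)
    # Pseudo-inverse handles the rank-deficient Sigma_B (rank <= C - 1).
    return float(np.trace(sigma_w @ np.linalg.pinv(sigma_b)) / len(classes))
```

Tracking this quantity layer by layer over training is one plausible way to diagnose collapse at intermediate depths, which is the setting the title refers to.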
| Main Authors: | Rangamani, Akshay; Lindegaard, Marius; Galanti, Tomer; Poggio, Tomaso |
|---|---|
| Format: | Article |
| Published: | Center for Brains, Minds and Machines (CBMM), 2023 |
| Online Access: | https://hdl.handle.net/1721.1/148239 |
Similar Items
- SGD Noise and Implicit Low-Rank Bias in Deep Neural Networks
  by: Galanti, Tomer, et al.
  Published: (2022)
- The Janus effects of SGD vs GD: high noise and low rank
  by: Xu, Mengjia, et al.
  Published: (2023)
- On Generalization Bounds for Neural Networks with Low Rank Layers
  by: Pinto, Andrea, et al.
  Published: (2024)
- SGD and Weight Decay Provably Induce a Low-Rank Bias in Deep Neural Networks
  by: Galanti, Tomer, et al.
  Published: (2023)
- Norm-Based Generalization Bounds for Compositionally Sparse Neural Networks
  by: Galanti, Tomer, et al.
  Published: (2023)