On Generalization Bounds for Neural Networks with Low Rank Layers
While previous optimization results have suggested that deep neural networks tend to favour low-rank weight matrices, the implications of this inductive bias on generalization bounds remain under-explored. In this paper, we apply a chain rule for Gaussian complexity (Maurer, 2016a) to analyze how lo...
Main Authors: Pinto, Andrea; Rangamani, Akshay; Poggio, Tomaso
Format: Article
Published: Center for Brains, Minds and Machines (CBMM), 2024
Online Access: https://hdl.handle.net/1721.1/157263
Similar Items

- Rank weight hierarchy of some classes of polynomial codes
  by: Ducoat, Jérôme, et al.
  Published: (2023)
- On the Complexity of Neural Computation in Superposition
  by: Adler, Micah, et al.
  Published: (2024)
- Enhancing e-commerce recommender system adaptability with online deep controllable Learning-To-Rank
  by: Zeng, Anxiang, et al.
  Published: (2021)
- On List-Decodability of Random Rank Metric Codes and Subspace Codes
  by: Ding, Yang
  Published: (2016)
- On the clique number of a strongly regular graph
  by: Greaves, Gary Royden Watson, et al.
  Published: (2021)