On the Complexity of Neural Computation in Superposition
Recent advances in the understanding of neural networks suggest that superposition, the ability of a single neuron to represent multiple features simultaneously, is a key mechanism underlying the computational efficiency of large-scale networks. This paper explores the theoretical foundations of com...
Main Authors: Adler, Micah; Shavit, Nir
Format: Article
Language: English
Published: 2024
Online Access: https://hdl.handle.net/1721.1/157073
Similar Items
- Parameter optimization of evolving spiking neural networks using improved firefly algorithm for classification tasks /
  by: Farezdzuan Roslan, et al.
  Published: (2018)
- On Generalization Bounds for Neural Networks with Low Rank Layers
  by: Pinto, Andrea, et al.
  Published: (2024)
- Volatility autocorrelation in the stock market with artificial neural networks
  by: Tham, Zhi Rong
  Published: (2024)
- A neural circuit for excessive feeding driven by environmental context in mice
  by: Mohammad, Hasan, et al.
  Published: (2022)