Self-Assembly of a Biologically Plausible Learning Circuit
Over the last four decades, the remarkable success of deep learning has been driven by the use of Stochastic Gradient Descent (SGD) as the main optimization technique. The default implementation for computing the gradient in SGD is backpropagation, which, with its variations, is used to this...
Main Authors: Liao, Qianli; Ziyin, Liu; Gan, Yulu; Cheung, Brian; Harnett, Mark; Poggio, Tomaso

Format: Article

Published: Center for Brains, Minds and Machines (CBMM), 2024

Online Access: https://hdl.handle.net/1721.1/157934
Similar Items
- A Homogeneous Transformer Architecture
  by: Gan, Yulu, et al.
  Published: (2023)
- On the Power of Decision Trees in Auto-Regressive Language Modeling
  by: Gan, Yulu, et al.
  Published: (2024)
- Formation of Representations in Neural Networks
  by: Ziyin, Liu, et al.
  Published: (2024)
- Efficient multi-scale representation of visual objects using a biologically plausible spike-latency code and winner-take-all inhibition
  by: Sanchez-Garcia, Melani, et al.
  Published: (2023)
- On Generalization Bounds for Neural Networks with Low Rank Layers
  by: Pinto, Andrea, et al.
  Published: (2024)