Small steps and giant leaps: Minimal Newton solvers for deep learning
We propose a fast second-order method that can be used as a drop-in replacement for current deep learning solvers. Compared to stochastic gradient descent (SGD), it only requires two additional forward-mode automatic differentiation operations per iteration, which has a computational cost comparable...
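The abstract's key operation, per-iteration curvature information obtained with forward-mode automatic differentiation, is commonly realized as a Hessian-vector product: differentiating the gradient in forward mode gives H(w)·v without ever forming the Hessian. The sketch below illustrates that building block in JAX; the toy `loss` function is a hypothetical stand-in for a network loss, not the paper's actual objective or algorithm.

```python
import jax
import jax.numpy as jnp

# Hypothetical scalar loss standing in for a network's training loss.
def loss(w):
    return jnp.sum(jnp.tanh(w) ** 2)

def hvp(f, w, v):
    # Forward-mode differentiation (jvp) of the reverse-mode gradient:
    # returns H(w) @ v without materializing the full Hessian.
    return jax.jvp(jax.grad(f), (w,), (v,))[1]

w = jnp.array([0.5, -1.0, 2.0])
v = jnp.array([1.0, 0.0, 0.0])
print(hvp(loss, w, v))  # curvature of the loss at w along direction v
```

Because each call costs only a constant factor more than a gradient evaluation, a solver can afford one or two of them per iteration, which is consistent with the overhead the abstract describes.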
Main Authors: Henriques, J; Ehrhardt, S; Albanie, S; Vedaldi, A
Format: Conference item
Language: English
Published: IEEE, 2020
Similar Items
- One small step can lead to one giant leap
  by: Michael Frumovitz, et al.
  Published: (2022-08-01)
- One small step for e-voting, one giant leap for democracy
  by: Marina Gorbatiuc
  Published: (2020-03-01)
- One small step for exosomes, one giant leap for Kawasaki disease
  by: Henrique Girão
  Published: (2016-05-01)
- Meta-learning with differentiable closed-form solvers
  by: Bertinetto, L, et al.
  Published: (2019)
- Stepped collaborative care for trauma: giant leaps for health equity
  by: Lisa M Kodadek, et al.
  Published: (2024-02-01)