Acceleration by stepsize hedging: Silver Stepsize Schedule for smooth convex optimization
We provide a concise, self-contained proof that the Silver Stepsize Schedule proposed in our companion paper applies directly to smooth (non-strongly) convex optimization. Specifically, we show that with these stepsizes, gradient descent computes an ε-minimizer in O(ε^(−log_ρ 2)) = O(ε^(−0.7864...)) iterations, where ρ = 1 + √2 is the silver ratio.
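The rate above comes from running plain gradient descent with a time-varying, non-monotone stepsize schedule built from the silver ratio ρ = 1 + √2. A minimal sketch of the idea follows; the closed form h_i = 1 + ρ^(ν₂(i)−1), with ν₂(i) the 2-adic valuation of i, is one common presentation of the silver stepsizes and is used here as an assumption — the schedule in the paper itself may adjust particular steps, so this is an illustration, not the authors' exact construction.

```python
import math

RHO = 1 + math.sqrt(2)  # the "silver ratio"


def nu2(i: int) -> int:
    # 2-adic valuation: the largest k such that 2**k divides i.
    return (i & -i).bit_length() - 1


def silver_step(i: int) -> float:
    # Assumed closed form for the i-th silver stepsize (i >= 1):
    #   h_i = 1 + RHO**(nu2(i) - 1)
    # First few values: sqrt(2), 2, sqrt(2), 2 + sqrt(2), ...
    return 1 + RHO ** (nu2(i) - 1)


def gradient_descent(grad, x0, n_steps, L=1.0):
    # Plain gradient descent x_{t+1} = x_t - (h_t / L) * grad(x_t)
    # with the silver schedule, for an L-smooth convex objective.
    x = list(x0)
    for i in range(1, n_steps + 1):
        g = grad(x)
        x = [xj - (silver_step(i) / L) * gj for xj, gj in zip(x, g)]
    return x
```

As a toy check, minimizing f(x) = ½‖x‖² (L = 1, gradient x) for 7 = 2³ − 1 steps shrinks the iterate substantially even though some individual stepsizes exceed the classical stability threshold 2/L — the hallmark of the hedging idea.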
Main Authors:
Other Authors:
Format: Article
Language: English
Published: Springer Berlin Heidelberg, 2024
Online Access: https://hdl.handle.net/1721.1/157702