Acceleration by stepsize hedging: Silver Stepsize Schedule for smooth convex optimization

We provide a concise, self-contained proof that the Silver Stepsize Schedule proposed in our companion paper directly applies to smooth (non-strongly) convex optimization. Specifically, we show that with these stepsizes, gradient descent computes an ε-minimizer in O(ε^{-log_ρ 2}) = O(ε^{-0.7864...}) iterations, where ρ = 1 + √2 is the silver ratio.

Full description

Bibliographic Details
Main Authors: Altschuler, Jason M., Parrilo, Pablo A.
Other Authors: Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Format: Article
Language:English
Published: Springer Berlin Heidelberg 2024
Online Access:https://hdl.handle.net/1721.1/157702
We provide a concise, self-contained proof that the Silver Stepsize Schedule proposed in our companion paper directly applies to smooth (non-strongly) convex optimization. Specifically, we show that with these stepsizes, gradient descent computes an ε-minimizer in O(ε^{-log_ρ 2}) = O(ε^{-0.7864}) iterations, where ρ = 1 + √2 is the silver ratio. This is intermediate between the textbook unaccelerated rate O(ε^{-1}) and the accelerated rate O(ε^{-1/2}) due to Nesterov in 1983. The Silver Stepsize Schedule is a simple explicit fractal: the i-th stepsize is 1 + ρ^{ν(i)-1}, where ν(i) is the 2-adic valuation of i. The design and analysis are conceptually identical to the strongly convex setting in our companion paper, but simplify remarkably in this specific setting.
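The schedule described above admits a very short implementation. The following is an illustrative sketch (not the authors' code): it computes stepsizes via the formula 1 + ρ^{ν(i)-1}, assuming stepsizes are measured in units of 1/L for an L-smooth objective, and applies them in plain gradient descent on a toy one-dimensional quadratic.

```python
import math

# Sketch of the Silver Stepsize Schedule (illustrative, not the authors'
# code): the i-th stepsize is 1 + rho**(nu(i) - 1), where rho = 1 + sqrt(2)
# is the silver ratio and nu(i) is the 2-adic valuation of i, i.e. the
# exponent of the largest power of 2 dividing i.

RHO = 1 + math.sqrt(2)  # silver ratio

def two_adic_valuation(i: int) -> int:
    """nu(i): number of times 2 divides i (for i >= 1)."""
    v = 0
    while i % 2 == 0:
        i //= 2
        v += 1
    return v

def silver_stepsize(i: int) -> float:
    """i-th stepsize of the Silver Stepsize Schedule (1-indexed)."""
    return 1 + RHO ** (two_adic_valuation(i) - 1)

# The schedule is a fractal: sqrt(2), 2, sqrt(2), 2 + sqrt(2), sqrt(2), ...
schedule = [silver_stepsize(i) for i in range(1, 9)]

def gradient_descent(grad, x0: float, L: float, n: int) -> float:
    """Gradient descent on an L-smooth convex f with stepsizes h_i / L."""
    x = x0
    for i in range(1, n + 1):
        x -= (silver_stepsize(i) / L) * grad(x)
    return x

# Toy usage: f(x) = x^2 / 2 is 1-smooth with gradient x and minimizer 0.
x_final = gradient_descent(lambda x: x, x0=1.0, L=1.0, n=16)
```

Note that individual stepsizes exceed the classical safe range (0, 2/L) — e.g. h_4 = 2 + √2 — which is exactly the "hedging" idea: occasional long steps interleaved with short ones, so that progress is guaranteed over the whole schedule rather than per iteration.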
Citation: Altschuler, J.M., Parrilo, P.A. Acceleration by stepsize hedging: Silver Stepsize Schedule for smooth convex optimization. Mathematical Programming (2024). https://doi.org/10.1007/s10107-024-02164-2
Published online: 2024-11-25
License: Creative Commons Attribution (https://creativecommons.org/licenses/by/4.0/), © The Author(s)