Convergence Rates of Gradient Methods for Convex Optimization in the Space of Measures
We study the convergence rate of Bregman gradient methods for convex optimization in the space of measures on a $d$-dimensional manifold. Under basic regularity assumptions, we show that the suboptimality gap at iteration $k$ is in $O(\log (k)k^{-1})$ for multiplicative updates, while it is in $O(k^...
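The abstract's "multiplicative updates" refer to Bregman (mirror) gradient steps with an entropy geometry. As a purely illustrative sketch (not the paper's method), the finite-dimensional analogue is entropic mirror descent over the probability simplex; the quadratic objective, step size, and problem sizes below are assumptions chosen for the demo.

```python
import numpy as np

# Illustrative sketch: entropic mirror descent ("multiplicative updates")
# for a convex objective over the probability simplex, a finite-dimensional
# analogue of convex optimization over probability measures.
# The objective f(x) = 0.5 * ||A x - b||^2 and step size eta are assumptions.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def f(x):
    r = A @ x - b
    return 0.5 * r @ r

def grad(x):
    return A.T @ (A @ x - b)

x = np.full(5, 0.2)      # uniform starting point on the simplex
eta = 0.01               # fixed step size (assumption)
vals = [f(x)]
for _ in range(1000):
    x = x * np.exp(-eta * grad(x))   # multiplicative (entropic) update
    x = x / x.sum()                  # renormalize: Bregman projection onto the simplex
    vals.append(f(x))
```

The update keeps every iterate strictly positive and on the simplex, which is what makes the entropy Bregman divergence (rather than the Euclidean distance) the natural measure of progress in the convergence analysis.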
Main Author: Chizat, Lénaïc
Format: Article
Language: English
Published: Université de Montpellier, 2023-01-01
Series: Open Journal of Mathematical Optimization
Online Access: https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.20/
Similar Items
- Convexity and optimization in Banach spaces / by: Barbu, Viorel, et al. Published: (1986)
- Biobjective convex programs in Banach space / by: Lim, Kim-Pin. Published: (1986)
- Adaptive Stochastic Gradient Descent Method for Convex and Non-Convex Optimization, by: Ruijuan Chen, et al. Published: (2022-11-01)
- Strict convexity and complex strict convexity: theory and applications / by: Istratescu, Vasile I. Published: (1984)
- Analysis of a Two-Step Gradient Method with Two Momentum Parameters for Strongly Convex Unconstrained Optimization, by: Gerasim V. Krivovichev, et al. Published: (2024-03-01)