Convergence Rates of Gradient Methods for Convex Optimization in the Space of Measures
We study the convergence rate of Bregman gradient methods for convex optimization in the space of measures on a $d$-dimensional manifold. Under basic regularity assumptions, we show that the suboptimality gap at iteration $k$ is in $O(\log (k)k^{-1})$ for multiplicative updates, while it is in $O(k^...
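The "multiplicative updates" mentioned in the abstract are Bregman (mirror) gradient steps taken in the entropy geometry. As a rough, non-authoritative illustration only, the sketch below runs such updates on a measure discretized as weights on a fixed one-dimensional grid, minimizing a toy least-squares objective; the grid, kernel, objective, step size, and iteration count are assumptions made for this example and are not taken from the paper.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's setup):
# entropic mirror descent ("multiplicative updates") on a measure
# discretized as nonnegative weights on a fixed 1-d grid, applied to
# a toy convex objective F(mu) = 0.5 * ||K mu - y||^2.

n = 200                           # number of grid points (assumed)
grid = np.linspace(0.0, 1.0, n)   # toy 1-d domain standing in for the manifold

# Smooth kernel and a synthetic target signal (both assumptions).
K = np.exp(-(grid[:, None] - grid[None, :]) ** 2 / 0.02)
target = np.exp(-(grid - 0.3) ** 2 / 0.01) + np.exp(-(grid - 0.7) ** 2 / 0.01)
y = K @ (target / target.sum())

def grad_F(weights):
    """Gradient of F with respect to the weights."""
    return K.T @ (K @ weights - y)

weights = np.full(n, 1.0 / n)     # uniform initial measure with unit mass
eta = 0.1                         # constant step size (assumed)

for k in range(2000):
    g = grad_F(weights)
    weights = weights * np.exp(-eta * g)   # multiplicative update
    weights /= weights.sum()               # renormalize to total mass 1

print("final objective:", 0.5 * np.linalg.norm(K @ weights - y) ** 2)
```

The update `w_i <- w_i * exp(-eta * g_i)` followed by renormalization is the standard multiplicative (entropic mirror descent) step on the probability simplex; the paper's analysis concerns the continuous, measure-valued counterpart of such iterations.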
Main Author: | |
---|---|
Format: | Article |
Language: | English |
Published: | Université de Montpellier, 2023-01-01 |
Series: | Open Journal of Mathematical Optimization |
Subjects: | |
Online Access: | https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.20/ |