Relatively Smooth Convex Optimization by First-Order Methods, and Applications
The usual approach to developing and analyzing first-order methods for smooth convex optimization assumes that the gradient of the objective function is uniformly smooth with some Lipschitz constant L. However, in many settings the differentiable convex function f(·) is not uniformly smooth; for example...
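As a brief gloss on the abstract (not part of the original record), the two conditions it contrasts can be written out explicitly. The symbols f, h, x, y, and L below are generic placeholders; the relative-smoothness inequality is the standard Bregman-divergence form associated with this line of work:

```latex
% Uniform smoothness: the gradient of f is Lipschitz with constant L;
% for convex f this is equivalent to the quadratic upper bound on the right.
\[
  \|\nabla f(x) - \nabla f(y)\| \;\le\; L\,\|x - y\|
  \quad \Longleftrightarrow \quad
  f(y) \;\le\; f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{L}{2}\,\|y - x\|^2 .
\]
% Relative smoothness: f is L-smooth relative to a reference function h
% when the quadratic term is replaced by the Bregman divergence of h.
\[
  f(y) \;\le\; f(x) + \langle \nabla f(x),\, y - x \rangle + L\, D_h(y, x),
  \qquad
  D_h(y, x) := h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle .
\]
```

Choosing the reference function h(x) = ‖x‖²/2 gives D_h(y, x) = ‖y − x‖²/2, so uniform smoothness is recovered as the special case of relative smoothness with a quadratic reference.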
| Main Authors: | Nesterov, Yurii; Lu, Haihao; Freund, Robert Michael |
|---|---|
| Other Authors: | Massachusetts Institute of Technology. Department of Mathematics |
| Format: | Article |
| Published: | Society for Industrial & Applied Mathematics (SIAM), 2019 |
| Online Access: | http://hdl.handle.net/1721.1/120867 https://orcid.org/0000-0002-5217-1894 https://orcid.org/0000-0002-1733-5363 |
Similar Items

- New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
  by: Freund, Robert Michael, et al.
  Published: (2018)
- Acceleration in first-order optimization methods: promenading beyond convexity or smoothness, and applications
  by: Martinez Rubio, D
  Published: (2021)
- Generalized stochastic Frank–Wolfe algorithm with stochastic “substitute” gradient for structured convex optimization
  by: Lu, Haihao, et al.
  Published: (2021)
- AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods
  by: Freund, Robert M., et al.
  Published: (2014)
- An extrapolated fixed-point optimization method for strongly convex smooth optimizations
  by: Duangdaw Rakjarungkiat, et al.
  Published: (2024-01-01)