When cyclic coordinate descent outperforms randomized coordinate descent
The coordinate descent (CD) method is a classical optimization algorithm that has seen a revival of interest because of its competitive performance in machine learning applications. A number of recent papers have provided convergence rate estimates for its deterministic (cyclic) and randomized variants...
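The cyclic and randomized variants mentioned in the abstract differ only in how the coordinate to update is chosen at each step. A minimal illustrative sketch on a small quadratic (the function names and the test problem are assumptions for illustration, not taken from the paper):

```python
import numpy as np

def coordinate_descent(A, b, n_iters=100, rule="cyclic", seed=0):
    """Minimize f(x) = 0.5 x^T A x - b^T x by exact coordinate minimization.

    rule="cyclic" sweeps the coordinates in order; rule="random" samples
    a coordinate uniformly at random at each step.
    """
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    for t in range(n_iters):
        i = t % n if rule == "cyclic" else rng.integers(n)
        # Exact minimization along coordinate i:
        # partial derivative is (A x)_i - b_i, curvature is A_ii.
        x[i] -= (A[i] @ x - b[i]) / A[i, i]
    return x

# Small positive-definite test problem (an assumption for illustration).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)
for rule in ("cyclic", "random"):
    x = coordinate_descent(A, b, n_iters=200, rule=rule)
    print(rule, np.allclose(x, x_star))
```

For quadratics, the cyclic rule with exact minimization coincides with the Gauss-Seidel method; the paper's contribution concerns when this deterministic ordering converges faster than uniform random sampling.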
| Main Authors | Gurbuzbalaban, Mert; Ozdaglar, Asuman E.; Parrilo, Pablo A.; Vanli, Nuri Denizcan |
| --- | --- |
| Other Authors | Massachusetts Institute of Technology. Laboratory for Information and Decision Systems |
| Format | Article |
| Language | English |
| Published | Neural Information Processing Systems Foundation, Inc., 2019 |
| Online Access | https://hdl.handle.net/1721.1/121536 |
Similar Items
- Randomness and permutations in coordinate descent methods
  by: Gürbüzbalaban, Mert, et al.
  Published: (2021)
- Why random reshuffling beats stochastic gradient descent
  by: Gürbüzbalaban, M., et al.
  Published: (2021)
- Convergence rate of block-coordinate maximization Burer–Monteiro method for solving large SDPs
  by: Erdogdu, Murat A., et al.
  Published: (2022)
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
  by: Vanli, Nuri Denizcan, et al.
  Published: (2019)
- Accelerating greedy coordinate descent methods
  by: Lu, Haihao, et al.
  Published: (2019)