ASPDC: Accelerated SPDC Regularized Empirical Risk Minimization for Ill-Conditioned Problems in Large-Scale Machine Learning
This paper aims to improve the response speed of SPDC (stochastic primal–dual coordinate ascent) in large-scale machine learning, as the per-iteration complexity of SPDC is not satisfactory. We propose an accelerated stochastic primal–dual coordinate ascent method called ASPDC and its further accelerate...
Main Authors: | Haobang Liang, Hao Cai, Hejun Wu, Fanhua Shang, James Cheng, Xiying Li |
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-07-01 |
Series: | Electronics |
Online Access: | https://www.mdpi.com/2079-9292/11/15/2382 |
Similar Items
- A primal-dual algorithm framework for convex saddle-point optimization
  by: Benxin Zhang, et al.
  Published: (2017-10-01)
- On the Number of Witnesses in the Miller–Rabin Primality Test
  by: Shamil Talgatovich Ishmukhametov, et al.
  Published: (2020-06-01)
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
  by: Anatoli Juditsky, et al.
  Published: (2014-09-01)
- An Inertial Modified S-Algorithm for Convex Minimization Problems with Directed Graphs and Its Applications in Classification Problems
  by: Kobkoon Janngam, et al.
  Published: (2022-11-01)
- A New Accelerated Fixed-Point Algorithm for Classification and Convex Minimization Problems in Hilbert Spaces with Directed Graphs
  by: Kobkoon Janngam, et al.
  Published: (2022-05-01)