GMRES-Accelerated ADMM for Quadratic Objectives

Bibliographic Details
Main Authors: Zhang, Richard Y, White, Jacob K
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: English
Published: Society for Industrial & Applied Mathematics (SIAM) 2021
Online Access: https://hdl.handle.net/1721.1/135162
Description
Summary: © 2018 Society for Industrial and Applied Mathematics. We consider the sequence acceleration problem for the alternating direction method of multipliers (ADMM) applied to a class of equality-constrained problems with strongly convex quadratic objectives, which frequently arise as the Newton subproblem of interior-point methods. Within this context, the ADMM update equations are linear, the iterates are confined to a Krylov subspace, and the generalized minimal residual (GMRES) algorithm is optimal in its ability to accelerate convergence. The basic ADMM method solves a κ-conditioned problem in O(√κ) iterations. We give theoretical justification and numerical evidence that the GMRES-accelerated variant consistently solves the same problem in O(κ^(1/4)) iterations, for an order-of-magnitude reduction in iterations, despite a worst-case bound of O(√κ) iterations. The method is shown to be competitive against standard preconditioned Krylov subspace methods for saddle-point problems. The method is embedded within SeDuMi, a popular open-source solver for conic optimization written in MATLAB, and used to solve many large-scale semidefinite programs with error that decreases like O(1/k^2), instead of O(1/k), where k is the iteration index.
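The core mechanism in the abstract can be sketched in a few lines. This is a hypothetical toy illustration, not the paper's implementation: when the objective is quadratic, the ADMM update is an affine map x ↦ Mx + v, so its fixed point solves the linear system (I − M)x = v. A random symmetric contraction stands in for the ADMM iteration map here; GMRES applied to that system searches the same Krylov subspace as the plain iteration but selects the residual-optimal iterate at every step.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

# Toy stand-in for the affine ADMM map x <- M x + v (assumption: a
# random symmetric contraction, not an actual ADMM operator).
rng = np.random.default_rng(0)
n = 200
S = rng.standard_normal((n, n))
S = (S + S.T) / 2
M = 0.99 * S / np.linalg.norm(S, 2)        # spectral radius 0.99: slow iteration
v = rng.standard_normal(n)
x_star = np.linalg.solve(np.eye(n) - M, v)  # exact fixed point, for reference

# Plain (unaccelerated) fixed-point iteration: error shrinks like 0.99^k.
x = np.zeros(n)
for _ in range(50):
    x = M @ x + v
err_fixed = np.linalg.norm(x - x_star)

# GMRES on (I - M) x = v, matrix-free: only products with M are needed,
# i.e. exactly the operation one sweep of the iteration already performs.
op = LinearOperator((n, n), matvec=lambda z: z - M @ z)
x_gmres, info = gmres(op, v, restart=50, maxiter=50)
err_gmres = np.linalg.norm(x_gmres - x_star)
print(err_fixed, err_gmres)  # the GMRES error is far smaller
```

Because GMRES is residual-optimal over the Krylov subspace that the plain iteration merely walks through, its iterate after k matrix products is never worse than the k-th fixed-point iterate, which is the sense in which the abstract calls it optimal for this problem class.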