Accelerated algorithms for constrained optimization and control

Bibliographic Details
Main Author: Parashar, Anjali
Other Authors: Annaswamy, Anuradha M.
Format: Thesis
Published: Massachusetts Institute of Technology 2023
Online Access: https://hdl.handle.net/1721.1/152459
Description
Summary: Nonlinear optimization with equality and inequality constraints is ubiquitous in optimization and control of large-scale systems. Ensuring feasibility while retaining acceptable convergence to an optimal solution remains an open and pressing problem in this area. A class of high-order tuners was recently proposed in the adaptive control literature to achieve accelerated convergence in the unconstrained setting. In this thesis, we propose a new algorithm based on high-order tuners that accommodates equality and inequality constraints. We leverage the linear structure of the solution space to guarantee that equality constraints are always satisfied. We further ensure feasibility with respect to inequality constraints, for the specific case of box constraints, by introducing time-varying gains into the high-order tuner while retaining its attractive accelerated convergence properties. Stability guarantees are also provided for time-varying regressors. These theoretical results are validated on several classes of optimization problems: academic examples, power flow optimization, and neural network training. We devote special attention to a particular case of neural network optimization, the linear neural network (LNN) training problem, to understand the dynamics of nonconvex optimization under gradient flow and to provide Lyapunov stability guarantees for LNNs.
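
As a concrete illustration of the equality-constraint mechanism described in the abstract, the sketch below parametrizes the iterate as x = x_p + N z, where x_p is any particular solution of A x = b and the columns of N span the null space of A, so every iterate satisfies the equality constraints by construction. The two-state update and the boundary-vanishing gain are generic, hypothetical stand-ins for the thesis's high-order tuner and time-varying gains; the function names and step sizes (gamma, beta) are assumptions for illustration, not the algorithm from the thesis.

import numpy as np

def null_space_basis(A, tol=1e-10):
    # Orthonormal basis N for {z : A z = 0}, computed via SVD.
    _, s, vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return vt[rank:].T

def constrained_tuner_sketch(grad, x0, A, b, lo, hi,
                             steps=500, gamma=0.05, beta=0.5):
    # Momentum-style two-state update restricted to {x : A x = b},
    # with a diagonal gain that shrinks near the box boundary [lo, hi].
    # Hypothetical stand-in for the thesis's high-order tuner.
    N = null_space_basis(A)
    x_p, *_ = np.linalg.lstsq(A, b, rcond=None)   # particular solution of A x = b
    z = N.T @ (x0 - x_p)                          # reduced (null-space) coordinates
    nu = z.copy()                                 # auxiliary ("high-order") state
    for _ in range(steps):
        x = x_p + N @ z                           # A x = b holds by construction
        gain = (x - lo) * (hi - x)                # damps steps near the box boundary
        g = N.T @ (gain * grad(x))                # reduced, gain-scaled gradient
        nu -= gamma * g                           # auxiliary-state update
        z += beta * (nu - z)                      # filtered main-state update
    return x_p + N @ z

# Example: minimize ||x - c||^2 subject to x1 + x2 + x3 = 1.5, 0 <= x <= 1.
A = np.array([[1.0, 1.0, 1.0]]); b = np.array([1.5])
c = np.array([0.9, 0.8, 0.1])
x_star = constrained_tuner_sketch(lambda x: 2 * (x - c), np.full(3, 0.5),
                                  A, b, lo=np.zeros(3), hi=np.ones(3))

In this sketch the equality constraints are exact at every iterate, while box feasibility relies on the boundary-vanishing gain and a small step size; the thesis's time-varying gains come with a formal feasibility guarantee.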