High-order tuners for convex optimization : stability and accelerated learning

Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, February, 2021

Bibliographic Details
Main Author: Moreu Gamazo, José M. (José María)
Other Authors: Anuradha Annaswamy.
Format: Thesis
Language: eng
Published: Massachusetts Institute of Technology, 2021
Subjects: Mechanical Engineering
Online Access: https://hdl.handle.net/1721.1/130859
Physical Description: 132 pages, application/pdf. Cataloged from the official PDF version of thesis. Includes bibliographical references (pages 131-132).

Abstract:
Iterative gradient-based algorithms have been increasingly applied to the training of a broad variety of machine learning models, including large neural networks. In particular, momentum-based methods with accelerated learning guarantees have received much attention because of their provably fast learning on certain classes of problems, and multiple such algorithms have been derived. However, these properties hold only for constant regressors. When regressors are time-varying, which is commonplace in dynamic systems, many of these momentum-based methods cannot guarantee stability. Recently, a new High-order Tuner (HT) was developed and shown to have 1) stability and asymptotic convergence for time-varying regressors and 2) non-asymptotic accelerated learning guarantees for constant regressors. These results were derived in a linear regression framework, which produces a quadratic loss function. This thesis extends and discusses the results of this same HT for general smooth convex loss functions. By exploiting the definitions of convexity and smoothness, we establish similar stability and asymptotic convergence guarantees. Additionally, we conjecture that the HT has an accelerated convergence rate. Finally, we provide numerical simulations supporting the satisfactory behavior of the HT algorithm as well as the conjecture of accelerated learning. (by José M. Moreu Gamazo, S.M.)

Rights: MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy: http://dspace.mit.edu/handle/1721.1/7582
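To make the class of algorithm concrete, below is a minimal illustrative sketch of a two-state (momentum-style) tuner with a normalized gradient step, applied to a smooth convex least-squares loss. This is NOT the thesis's exact algorithm: the update structure, gains `gamma` and `beta`, and the normalization term are illustrative assumptions chosen only to show the general shape of a high-order tuner, in which an auxiliary state is driven by a normalized gradient and the parameter estimate tracks that state.

```python
import numpy as np

def high_order_tuner_sketch(A, b, gamma=0.1, beta=0.5, steps=200):
    """Illustrative two-state gradient iteration on f(theta) = 0.5*||A @ theta - b||^2.

    Hypothetical gains and normalization; not the thesis's exact HT update.
    """
    n = A.shape[1]
    theta = np.zeros(n)  # parameter estimate
    nu = np.zeros(n)     # auxiliary (momentum-like) state
    # Normalization bounds the effective step size via the spectral norm of A.
    N = 1.0 + np.linalg.norm(A, 2) ** 2
    for _ in range(steps):
        grad = A.T @ (A @ theta - b)        # gradient of the quadratic loss
        nu = nu - gamma * beta * grad / N   # drive auxiliary state by normalized gradient
        theta = theta - beta * (theta - nu) # pull the estimate toward the auxiliary state
    return theta

# Usage: recover a parameter vector from noiseless linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
theta_star = np.array([1.0, -2.0, 0.5])
b = A @ theta_star
theta_hat = high_order_tuner_sketch(A, b, steps=5000)
```

The two-state structure (an auxiliary variable updated by the gradient, with the estimate relaxing toward it) is what distinguishes such "high-order" tuners from plain gradient descent; the gradient normalization is the ingredient that, in the thesis's setting, enables stability under time-varying regressors.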