Controlled gradient descent: A control theoretical perspective for optimization

The Gradient Descent (GD) paradigm is a foundational principle of modern optimization algorithms. The GD algorithm and its variants, including accelerated optimization algorithms, geodesic optimization, natural gradient, and contraction-based optimization, to name a few, are used in machine learning...
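For context, the basic GD update the abstract refers to can be sketched as follows. This is an illustrative vanilla gradient-descent loop only, not the control-theoretic formulation proposed in the article; the function `gradient_descent` and its parameters are assumptions chosen for the example.

```python
# Minimal gradient-descent sketch (illustrative only; not the paper's
# control-theoretic method). Minimizes a 1-D function given its gradient.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        # Standard GD update: x_{k+1} = x_k - lr * grad f(x_k)
        x = x - lr * grad(x)
    return x

# Example: minimize f(x) = x^2, whose gradient is 2x; the minimum is at x = 0.
x_min = gradient_descent(lambda x: 2 * x, x0=5.0)
```

With `lr=0.1` the update contracts the iterate by a factor of 0.8 per step, so the sequence converges geometrically toward the minimizer at zero.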


Bibliographic Details
Main Authors: Revati Gunjal, Syed Shadab Nayyer, S.R. Wagh, N.M. Singh
Format: Article
Language: English
Published: Elsevier 2024-06-01
Series: Results in Control and Optimization
Online Access: http://www.sciencedirect.com/science/article/pii/S266672072400047X