One-Pass Learning via Bridging Orthogonal Gradient Descent and Recursive Least-Squares
While deep neural networks are capable of achieving state-of-the-art performance in various domains, their training typically requires iterating for many passes over the dataset. However, due to computational and memory constraints and potential privacy concerns, storing and accessing all the data i...
Main Author: Min, Youngjae
Other Authors: Azizan, Navid
Format: Thesis
Published: Massachusetts Institute of Technology, 2025
Online Access: https://hdl.handle.net/1721.1/158204 https://orcid.org/0000-0002-3737-1206
Similar Items
- Orthogonal vs. uncorrelated least squares discriminant analysis for feature extraction
  by: Nie, Feiping, et al.
  Published: (2013)
- Active vibration control of flexible beam incorporating recursive least square and neural network algorithms
  by: Abd. Jalil, Nurhanafifi
  Published: (2024)
- Least squares and the not-Normal Equations
  by: Wathen, AJ
  Published: (2025)
- Byzantine-resilient decentralized stochastic gradient descent
  by: Guo, Shangwei, et al.
  Published: (2024)
- Distributed Singular Value Decomposition Through Least Squares
  by: Zhao, Freddie
  Published: (2024)