One-Pass Learning via Bridging Orthogonal Gradient Descent and Recursive Least-Squares
While deep neural networks are capable of achieving state-of-the-art performance in various domains, their training typically requires iterating for many passes over the dataset. However, due to computational and memory constraints and potential privacy concerns, storing and accessing all the data i...
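As background for the recursive-least-squares half of the title, below is a minimal sketch of the classical RLS recursion for a linear model y ≈ wᵀx, which updates its estimate from each sample exactly once. This is standard textbook RLS, not the thesis's method; the class name `RLS` and its parameters are illustrative.

```python
import numpy as np

class RLS:
    """Classical recursive least-squares for a linear model (one pass over the data)."""

    def __init__(self, dim, reg=1.0):
        self.w = np.zeros(dim)       # current weight estimate
        self.P = np.eye(dim) / reg   # inverse of the regularized Gram matrix

    def update(self, x, y):
        Px = self.P @ x
        k = Px / (1.0 + x @ Px)          # gain vector (Sherman-Morrison)
        self.w += k * (y - self.w @ x)   # correct by the prediction error
        self.P -= np.outer(k, Px)        # rank-one update of the inverse Gram matrix
```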
Main Author:
Other Authors:
Format: Thesis
Published: Massachusetts Institute of Technology, 2025
Online Access: https://hdl.handle.net/1721.1/158204 https://orcid.org/0000-0002-3737-1206