Simple, fast, and flexible framework for matrix completion with infinite width neural networks
Significance: Matrix completion is a fundamental problem in machine learning that arises in various applications. We envision that our infinite width neural network framework for matrix completion will be easily deployable and produce stro...
Main Authors: Radhakrishnan, Adityanarayanan; Stefanakis, George; Belkin, Mikhail; Uhler, Caroline
Other Authors: Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Format: Article
Language: English
Published: Proceedings of the National Academy of Sciences, 2022
Online Access: https://hdl.handle.net/1721.1/143919
Similar Items
- Overparameterized neural networks implement associative memory
  by: Radhakrishnan, Adityanarayanan, et al.
  Published: (2021)
- Theory and Applications of Matrix Completion in Genomics Datasets
  by: Stefanakis, George
  Published: (2022)
- Counting Markov equivalence classes for DAG models on trees
  by: Radhakrishnan, Adityanarayanan, et al.
  Published: (2022)
- Counting Markov equivalence classes for DAG models on trees
  by: Radhakrishnan, Adityanarayanan, et al.
  Published: (2021)
- Counting Markov equivalence classes by number of immoralities
  by: Radhakrishnan, Adityanarayanan, et al.
  Published: (2020)