Simple, fast, and flexible framework for matrix completion with infinite width neural networks
Main Authors: | Radhakrishnan, Adityanarayanan; Stefanakis, George; Belkin, Mikhail; Uhler, Caroline |
---|---|
Other Authors: | Massachusetts Institute of Technology. Laboratory for Information and Decision Systems |
Format: | Article |
Language: | English |
Published: | Proceedings of the National Academy of Sciences, 2022 |
Online Access: | https://hdl.handle.net/1721.1/143919 |
author | Radhakrishnan, Adityanarayanan; Stefanakis, George; Belkin, Mikhail; Uhler, Caroline |
author2 | Massachusetts Institute of Technology. Laboratory for Information and Decision Systems |
collection | MIT |
description | Significance: Matrix completion is a fundamental problem in machine learning that arises in various applications. We envision that our infinite width neural network framework for matrix completion will be easily deployable and produce strong baselines for a wide range of applications at limited computational costs. We demonstrate the flexibility of our framework through competitive results on virtual drug screening and image inpainting/reconstruction. Simplicity and speed are showcased by the fact that most results in this work require only a central processing unit and commodity hardware. Through its connection to semisupervised learning, our framework provides a principled approach for matrix completion that can be easily applied to problems well beyond those of image completion and virtual drug screening considered in this paper. |
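The description above only summarizes the framework at a high level. As a rough illustration of the general idea, and not the article's exact method, the sketch below completes a matrix by kernel regression with the closed-form NTK of a one-hidden-layer infinite-width ReLU network. The one-hot row/column featurization of entries, the ridge term, and the function names `relu_ntk` and `complete_matrix` are illustrative assumptions introduced here, not taken from the paper.

```python
import numpy as np

def relu_ntk(X, Z):
    """Closed-form NTK of a one-hidden-layer infinite-width ReLU network."""
    dot = X @ Z.T
    nx = np.linalg.norm(X, axis=1, keepdims=True)
    nz = np.linalg.norm(Z, axis=1, keepdims=True)
    u = np.clip(dot / (nx * nz.T), -1.0, 1.0)
    k0 = (np.pi - np.arccos(u)) / np.pi                              # arc-cosine kernel, degree 0
    k1 = (u * (np.pi - np.arccos(u)) + np.sqrt(1.0 - u**2)) / np.pi  # arc-cosine kernel, degree 1
    return dot * k0 + (nx * nz.T) * k1

def complete_matrix(M, mask, ridge=1e-3):
    """Fill unobserved entries of M (mask == 1 marks observed) via NTK kernel ridge regression.
    Each entry (i, j) is featurized as concatenated one-hot row and column indicators."""
    n, m = M.shape
    idx = np.arange(n * m)
    rows, cols = np.divmod(idx, m)
    feats = np.zeros((n * m, n + m))
    feats[idx, rows] = 1.0
    feats[idx, n + cols] = 1.0

    obs = mask.ravel().astype(bool)
    K_oo = relu_ntk(feats[obs], feats[obs])          # kernel among observed entries
    K_ao = relu_ntk(feats, feats[obs])               # kernel from all entries to observed ones
    alpha = np.linalg.solve(K_oo + ridge * np.eye(obs.sum()), M.ravel()[obs])
    pred = (K_ao @ alpha).reshape(n, m)

    out = M.astype(float).copy()
    out[mask == 0] = pred[mask == 0]                 # only overwrite the missing entries
    return out

# Tiny usage example: recover a rank-1 matrix with roughly half the entries hidden.
rng = np.random.default_rng(0)
A = np.outer(rng.normal(size=8), rng.normal(size=6))
mask = (rng.random(A.shape) < 0.5).astype(int)
A_hat = complete_matrix(A * mask, mask)
print("mean abs error on unobserved entries:", np.abs(A_hat - A)[mask == 0].mean())
```

Note that this runs entirely on a CPU with NumPy, in line with the description's emphasis on commodity hardware, but the featurization and regularization choices here are placeholders rather than the paper's configuration.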
format | Article |
institution | Massachusetts Institute of Technology |
language | English |
publishDate | 2022 |
publisher | Proceedings of the National Academy of Sciences |
department | Massachusetts Institute of Technology. Laboratory for Information and Decision Systems; Massachusetts Institute of Technology. Institute for Data, Systems, and Society |
type | Journal Article (http://purl.org/eprint/type/JournalArticle) |
citation | Radhakrishnan, Adityanarayanan, Stefanakis, George, Belkin, Mikhail and Uhler, Caroline. 2022. "Simple, fast, and flexible framework for matrix completion with infinite width neural networks." Proceedings of the National Academy of Sciences, 119 (16). |
doi | 10.1073/pnas.2115064119 |
dateIssued | 2022-04-19 |
rights | Creative Commons Attribution-NonCommercial-NoDerivs License (http://creativecommons.org/licenses/by-nc-nd/4.0/) |
fileFormat | application/pdf |
title | Simple, fast, and flexible framework for matrix completion with infinite width neural networks |
url | https://hdl.handle.net/1721.1/143919 |