Noisy Tensor Completion via the Sum-of-Squares Hierarchy

© 2016 B. Barak & A. Moitra. In the noisy tensor completion problem we observe m entries (whose locations are chosen uniformly at random) from an unknown n1 × n2 × n3 tensor T. We assume that T is entry-wise close to being rank r. Our goal is to fill in its missing entries using as few observations as possible. Let n = max(n1, n2, n3). We show that if m = n^{3/2} r then there is a polynomial-time algorithm, based on the sixth level of the sum-of-squares hierarchy, for completing it. Our estimate agrees with almost all of T's entries almost exactly and works even when our observations are corrupted by noise. This is also the first algorithm for tensor completion that works in the overcomplete case when r > n, and in fact it works all the way up to r = n^{3/2 − ε}. Our proofs are short and simple and are based on establishing a new connection between noisy tensor completion (through the language of Rademacher complexity) and the task of refuting random constraint satisfaction problems. This connection seems to have gone unnoticed even in the context of matrix completion. Furthermore, we use this connection to show matching lower bounds. Our main technical result is a characterization of the Rademacher complexity of the sequence of norms that arise in the sum-of-squares relaxations to the tensor nuclear norm. These results point to an interesting new direction: can we explore computational vs. sample complexity tradeoffs through the sum-of-squares hierarchy?
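
As a reading aid for the abstract, the sketch below illustrates the kind of convex program and Rademacher-complexity bound being described. It is not text from the paper: the norm symbol \|\cdot\|_{K_6}, the tolerance \delta, the entry normalization, and the suppression of constants and polylogarithmic factors are all assumptions introduced here.

\[
  \widehat{T} \;\in\; \arg\min_{X}\ \|X\|_{K_6}
  \quad \text{subject to} \quad
  \frac{1}{m}\sum_{(i,j,k)\in\Omega}\bigl(X_{ijk}-T_{ijk}\bigr)^2 \;\le\; \delta,
\]

where \Omega is the set of m observed locations and \|\cdot\|_{K_6} denotes (in the notation used here) the norm arising from the sixth level of the sum-of-squares relaxation of the tensor nuclear norm. The "Rademacher complexity of the sequence of norms" mentioned in the abstract refers to a quantity of the shape

\[
  \mathbb{E}_{\sigma}\,\sup_{\|X\|_{K_6}\le 1}\;
  \frac{1}{m}\Bigl|\sum_{(i,j,k)\in\Omega}\sigma_{ijk}\,X_{ijk}\Bigr|,
  \qquad \sigma_{ijk}\ \text{i.i.d. uniform on}\ \{\pm 1\},
\]

which, roughly and under suitable normalization, the paper's main technical result bounds by about \sqrt{n^{3/2}/m} up to polylogarithmic factors; folding in the dependence on the rank parameter r is what leads to the m = n^{3/2} r sample-complexity claim in the abstract.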

Bibliographic Details
Main Authors: Barak, Boaz; Moitra, Ankur
Other Authors: Massachusetts Institute of Technology. Department of Mathematics; Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: English
Published: 2016 (added to the MIT repository in 2021)
Citation: Barak, Boaz and Moitra, Ankur. 2016. "Noisy Tensor Completion via the Sum-of-Squares Hierarchy."
Online Access: https://hdl.handle.net/1721.1/137985
Full Text (PDF): http://proceedings.mlr.press/v49/barak16.pdf
License: Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/)