Dictionary Learning and Tensor Decomposition via the Sum-of-Squares Method
We give a new approach to the dictionary learning (also known as "sparse coding") problem of recovering an unknown n × m matrix A (for m ≥ n) from examples of the form y = Ax + e, where x is a random vector in ℝ^m with at most τm nonzero coordinates, and e is a random noise vector in...
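As a minimal illustration of the generative model in the abstract, the sketch below draws synthetic examples y = Ax + e with a τm-sparse coefficient vector x. The dimensions, sparsity level, and noise scale are placeholder assumptions for illustration, not values from the paper, and this only samples from the model; it does not implement the authors' sum-of-squares recovery algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper):
# n-dimensional observations, m dictionary atoms, m >= n.
n, m = 20, 50
tau = 0.1  # sparsity fraction: x has at most tau*m nonzero coordinates

# Unknown dictionary A (n x m) with unit-norm columns.
A = rng.standard_normal((n, m))
A /= np.linalg.norm(A, axis=0)

def sample_example(noise_scale=0.01):
    """Draw one example y = A x + e from the sparse-coding model."""
    # x is supported on at most tau*m coordinates, chosen at random.
    x = np.zeros(m)
    support = rng.choice(m, size=int(tau * m), replace=False)
    x[support] = rng.standard_normal(len(support))
    # e is a small random noise vector.
    e = noise_scale * rng.standard_normal(n)
    return A @ x + e

y = sample_example()
```

The dictionary-learning task is the inverse problem: given many such samples y (but neither A nor x), recover A approximately.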
Main Authors: Barak, Boaz; Steurer, David; Kelner, Jonathan Adam
Other Authors: Massachusetts Institute of Technology. Department of Mathematics
Format: Article
Language: English
Published: Association for Computing Machinery, 2016
Online Access: http://hdl.handle.net/1721.1/105133 ; https://orcid.org/0000-0002-4257-4198
Similar Items
- Rounding sum-of-squares relaxations
  by: Barak, Boaz, et al.
  Published: (2015)
- Noisy Tensor Completion via the Sum-of-Squares Hierarchy
  by: Barak, Boaz, et al.
  Published: (2021)
- Noisy tensor completion via the sum-of-squares hierarchy
  by: Barak, Boaz, et al.
  Published: (2022)
- Hypercontractivity, sum-of-squares proofs, and their applications
  by: Barak, Boaz, et al.
  Published: (2013)
- A Nearly Tight Sum-of-Squares Lower Bound for the Planted Clique Problem
  by: Barak, Boaz, et al.
  Published: (2018)