Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality?
[formerly titled "Why and When Can Deep – but Not Shallow – Networks Avoid the Curse of Dimensionality: a Review"] The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning…
Main Authors: Poggio, Tomaso; Mhaskar, Hrushikesh; Rosasco, Lorenzo; Miranda, Brando; Liao, Qianli
Format: Technical Report
Language: en_US
Published: Center for Brains, Minds and Machines (CBMM), arXiv, 2016
Online Access: http://hdl.handle.net/1721.1/105443
Similar Items
- Why and when can deep - but not shallow - networks avoid the curse of dimensionality: A review
  by: Mhaskar, Hrushikesh, et al. Published: (2017)
- Theory I: Deep networks and the curse of dimensionality
  by: Poggio, T., et al. Published: (2018-12-01)
- Deep vs. shallow networks: An approximation theory perspective
  by: Mhaskar, Hrushikesh, et al. Published: (2016)
- When Is Handcrafting Not a Curse?
  by: Liao, Qianli, et al. Published: (2018)
- An analysis of training and generalization errors in shallow and deep networks
  by: Mhaskar, Hrushikesh, et al. Published: (2018)