Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review
The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage.
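The exponential gap the abstract refers to is a parameter-count comparison: for a d-variate compositional target built from bivariate constituent functions of smoothness r, the review's bounds give roughly O((d-1)·ε^(-2/r)) units for a deep network matching the compositional structure versus O(ε^(-d/r)) units for a shallow one. The following minimal Python sketch (not from the paper; function names and sample values are illustrative assumptions) just evaluates these two counts side by side.

```python
# Illustrative sketch, assuming the binary-tree compositional setting of the
# review: d inputs, constituent functions of 2 variables, smoothness r,
# target accuracy eps. Constants hidden by the O(.) notation are dropped.

def deep_units(d: int, eps: float, r: float) -> float:
    """Units for a deep net mirroring the compositional tree:
    d-1 nodes, each approximating a bivariate function to accuracy ~eps."""
    return (d - 1) * eps ** (-2.0 / r)

def shallow_units(d: int, eps: float, r: float) -> float:
    """Units for a single-hidden-layer net approximating a generic
    d-variate function of smoothness r to accuracy ~eps."""
    return eps ** (-d / r)

if __name__ == "__main__":
    d, eps, r = 8, 0.1, 2.0  # hypothetical values, not taken from the paper
    print(f"deep    ~ {deep_units(d, eps, r):,.0f} units")     # ~70
    print(f"shallow ~ {shallow_units(d, eps, r):,.0f} units")  # ~10,000
```

Even at d = 8 the shallow count dominates, and the gap grows exponentially in d, which is the curse of dimensionality the deep architecture avoids.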
| Main Authors: | Mhaskar, Hrushikesh; Rosasco, Lorenzo; Miranda, Brando; Liao, Qianli; Poggio, Tomaso A. |
| --- | --- |
| Other Authors: | Center for Brains, Minds and Machines at MIT |
| Format: | Article |
| Language: | English |
| Published: | Institute of Automation, Chinese Academy of Sciences, 2017 |
| Online Access: | http://hdl.handle.net/1721.1/107679 https://orcid.org/0000-0002-3944-0455 |
Similar Items
- Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality?
  by: Poggio, Tomaso, et al.
  Published: (2016)
- Learning Real and Boolean Functions: When Is Deep Better Than Shallow
  by: Mhaskar, Hrushikesh, et al.
  Published: (2016)
- When Is Handcrafting Not a Curse?
  by: Liao, Qianli, et al.
  Published: (2018)
- An analysis of training and generalization errors in shallow and deep networks
  by: Mhaskar, Hrushikesh, et al.
  Published: (2018)
- Deep vs. shallow networks: An approximation theory perspective
  by: Mhaskar, Hrushikesh, et al.
  Published: (2016)