Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review
The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems and conjectures.
Main Authors: | Mhaskar, Hrushikesh, Rosasco, Lorenzo, Miranda, Brando, Liao, Qianli, Poggio, Tomaso A |
---|---|
Other Authors: | Center for Brains, Minds and Machines at MIT |
Format: | Article |
Language: | English |
Published: | Institute of Automation, Chinese Academy of Sciences, 2017 |
Online Access: | http://hdl.handle.net/1721.1/107679 https://orcid.org/0000-0002-3944-0455 |
author | Mhaskar, Hrushikesh Rosasco, Lorenzo Miranda, Brando Liao, Qianli Poggio, Tomaso A |
author2 | Center for Brains, Minds and Machines at MIT |
collection | MIT |
description | The paper reviews and extends an emerging body of theoretical results on deep learning including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represent an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems and conjectures. |
id | mit-1721.1/107679 |
institution | Massachusetts Institute of Technology |
record_format | dspace |
contributors | Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences; McGovern Institute for Brain Research at MIT; McGovern Institute for Brain Research at MIT. Center for Brains, Minds, and Machines
funding | National Science Foundation (U.S.) (STC award CCF No. 1231216); United States. Army Research Office (No. W911NF-15-1-0385)
type | Article (http://purl.org/eprint/type/JournalArticle)
issn | 1476-8186; 1751-8520
journal | International Journal of Automation and Computing
citation | Poggio, Tomaso, Hrushikesh Mhaskar, Lorenzo Rosasco, Brando Miranda, and Qianli Liao. "Why and When Can Deep-but Not Shallow-Networks Avoid the Curse of Dimensionality: A Review." International Journal of Automation and Computing (March 14, 2017).
doi | http://dx.doi.org/10.1007/s11633-017-1054-2
rights | Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/); The Author(s)
dates | Issued 2017-03; deposited 2017-03-23
publisher | Institute of Automation, Chinese Academy of Sciences; Springer