Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality?

[Formerly titled "Why and When Can Deep – but Not Shallow – Networks Avoid the Curse of Dimensionality: a Review"] The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems and conjectures.
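As a rough illustration of the separation the memo analyzes, the sketch below compares unit counts under the assumption that a generic shallow network needs on the order of eps^(-d/m) units to approximate a d-variate function of smoothness m to accuracy eps, while a deep network matching a binary-tree compositional structure (bivariate constituents) needs on the order of (d-1) * eps^(-2/m). The function names and concrete numbers here are ours, for illustration only, not the paper's.

```python
import math

def shallow_units(d, eps, m=2):
    # Generic shallow network: unit count grows like eps^(-d/m),
    # i.e. exponentially in the input dimension d -- the curse
    # of dimensionality. (Assumed bound, for illustration.)
    return math.ceil(eps ** (-d / m))

def deep_units(d, eps, m=2):
    # Deep network mirroring a binary-tree compositional structure
    # with bivariate constituent functions: roughly
    # (d - 1) * eps^(-2/m) units, i.e. only linear in d.
    # (Assumed bound, for illustration.)
    return math.ceil((d - 1) * eps ** (-2 / m))

if __name__ == "__main__":
    # The gap widens rapidly with dimension at fixed accuracy.
    for d in (4, 8, 16):
        print(f"d={d:2d}  shallow~{shallow_units(d, 0.1)}  deep~{deep_units(d, 0.1)}")
```

The point of the comparison is structural: the deep count is polynomial in d only when the target function itself is compositional, which is exactly the condition the memo isolates.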


Bibliographic Details
Main Authors: Poggio, Tomaso, Mhaskar, Hrushikesh, Rosasco, Lorenzo, Miranda, Brando, Liao, Qianli
Format: Technical Report
Language: English
Published: Center for Brains, Minds and Machines (CBMM), arXiv, 2016
Subjects: Deep Learning; deep convolutional networks
Online Access: http://hdl.handle.net/1721.1/105443
Institution: Massachusetts Institute of Technology
Series: CBMM Memo Series; 058
arXiv ID: arXiv:1611.00740v5
Date Issued: 2016-11-23
Rights: Attribution-NonCommercial-ShareAlike 3.0 United States (http://creativecommons.org/licenses/by-nc-sa/3.0/us/)
Funding: This work was supported by the Center for Brains, Minds and Machines (CBMM), funded by NSF STC award CCF-1231216.