Stable Foundations for Learning: a foundational framework for learning theory in both the classical and modern regime.

Full description

Bibliographic Details
Main Author: Poggio, Tomaso
Format: Technical Report
Published: Center for Brains, Minds and Machines (CBMM) 2020
Online Access: https://hdl.handle.net/1721.1/124343
Collection: MIT
Description: We consider here the class of supervised learning algorithms known as Empirical Risk Minimization (ERM). The classical theory by Vapnik and others characterizes universal consistency of ERM in the classical regime, in which the architecture of the learning network is fixed and n, the number of training examples, goes to infinity. According to the classical theory, the minimizer of the empirical risk is consistent if the hypothesis space has finite complexity. We do not have a similar general theory for the modern regime of interpolating regressors and over-parameterized deep networks, in which d > n and d/n remains constant as n goes to infinity. In this note I propose the outline of such a theory based on the specific notion of CVloo stability of the learning algorithm with respect to perturbations of the training set. The theory shows that for interpolating regressors and separating classifiers (either kernel machines or deep ReLU networks): (1) minimizing CVloo stability minimizes the expected error; and (2) the most stable solutions are minimum-norm solutions. The hope is that this approach may lead to a unified theory encompassing both the modern regime and the classical one.
ID: mit-1721.1/124343
Institution: Massachusetts Institute of Technology
Series: CBMM Memo 103
Date: 2020-03-25
File Format: application/pdf
Funding: This work was supported by the Center for Brains, Minds and Machines (CBMM), funded by NSF STC award CCF-1231216.