- 1. Deep vs. shallow networks: An approximation theory perspective
  Published 2016. Technical Report.
- 3. On the Routh approximation technique and least squares errors
  Published 2002.
- 6. The selection of network functions to approximate prescribed frequency characteristics
  Published 2004.
- 7. Hierarchical aggregation of linear systems with multiple time scales
  Published 2002.
- 8. Segmented approximation and analysis of stochastic processes
  Published 2023. Thesis.
- 9. An Overview of Some Issues in the Theory of Deep Networks
  Published 2021. Article.
  “…We review our contributions in the areas of approximation theory and optimization. We also introduce a new approach based on cross-validation leave-one-out stability to estimate bounds on the expected error of overparametrized classifiers, such as deep networks. © 2020 Institute of Electrical Engineers of Japan. …”
- 10. Networks and the Best Approximation Property
  Published 2004.
  “…From the point of view of approximation theory, however, the property of approximating continuous functions arbitrarily well is not sufficient for characterizing good approximation schemes. …”
- 11. On the Convergence of Stochastic Iterative Dynamic Programming Algorithms
  Published 2004.
  “…In this paper we provide a rigorous proof of convergence of these DP-based learning algorithms by relating them to the powerful techniques of stochastic approximation theory via a new convergence theorem. The theorem establishes a general class of convergent algorithms to which both TD(lambda) and Q-learning belong. …”
- 12. Bounds on Urysohn width
  Published 2022. Thesis.
  “…This notion was introduced in the context of dimension theory, used in approximation theory, appeared in the work of Gromov on systolic geometry, and nowadays it is a metric invariant of independent interest. …”
- 13. An Overview of Some Issues in the Theory of Deep Networks
  Published 2022. Article.
  “…We review our contributions in the areas of approximation theory and optimization. We also introduce a new approach based on cross-validation leave-one-out stability to estimate bounds on the expected error of overparametrized classifiers, such as deep networks. © 2020 Institute of Electrical Engineers of Japan. …”
- 14. Hierarchically Local Tasks and Deep Convolutional Networks
  Published 2020. Technical Report.
  “…Recent results in approximation theory have shown that there is an exponential advantage of deep convolutional-like networks in approximating functions with hierarchical locality in their compositional structure. …”
- 15. How Many Subpopulations Is Too Many? Exponential Lower Bounds for Inferring Population Histories
  Published 2021. Article.
  “…Using a variety of tools from information theory, the theory of extremal polynomials, and approximation theory, we prove new sharp information-theoretic lower bounds on the problem of reconstructing population structure - the history of multiple subpopulations that merge, split, and change sizes over time. …”
- 16. How Many Subpopulations Is Too Many? Exponential Lower Bounds for Inferring Population Histories
  Published 2022. Article.
  “…Using a variety of tools from information theory, the theory of extremal polynomials, and approximation theory, we prove new sharp information-theoretic lower bounds on the problem of reconstructing population structure - the history of multiple subpopulations that merge, split, and change sizes over time. …”
- 17. How Many Subpopulations Is Too Many?: Exponential Lower Bounds for Inferring Population Histories
  Published 2020. Book.
  “…Using a variety of tools from information theory, the theory of extremal polynomials, and approximation theory, we prove new sharp information-theoretic lower bounds on the problem of reconstructing population structure - the history of multiple subpopulations that merge, split and change sizes over time. …”
18
Theoretical issues in deep networks
Published 2021“…We review our recent results toward this goal. In approximation theory both shallow and deep networks are known to approximate any continuous functions at an exponential cost. …”
Get full text
Article -
19
Theoretical Issues in Deep Networks
Published 2019“…We review our recent results towards this goal. In {\it approximation theory} both shallow and deep networks are known to approximate any continuous functions on a bounded domain at a cost which is exponential (the number of parameters is exponential in the dimensionality of the function). …”
Get full text
Technical Report -
- 20. Accelerating Asymptotically Exact MCMC for Computationally Intensive Models via Local Approximations
  Published 2015. Article.
  “…Our approach introduces local approximations of these models into the Metropolis-Hastings kernel, borrowing ideas from deterministic approximation theory, optimization, and experimental design. …”