Showing 1 - 20 results of 28 for search '"approximation theory"', query time: 0.07s
  9. An Overview of Some Issues in the Theory of Deep Networks by Poggio, Tomaso, Banburski, Andrzej
    Published 2021
    “…We review our contributions in the areas of approximation theory and optimization. We also introduce a new approach based on cross-validation leave-one-out stability to estimate bounds on the expected error of overparametrized classifiers, such as deep networks. © 2020 Institute of Electrical Engineers of Japan. …”
    Get full text
    Article
  10. Networks and the Best Approximation Property by Girosi, Federico, Poggio, Tomaso
    Published 2004
    “…From the point of view of approximation theory, however, the property of approximating continuous functions arbitrarily well is not sufficient for characterizing good approximation schemes. …”
    Get full text
  11. On the Convergence of Stochastic Iterative Dynamic Programming Algorithms by Jaakkola, Tommi, Jordan, Michael I., Singh, Satinder P.
    Published 2004
    “…In this paper we provide a rigorous proof of convergence of these DP-based learning algorithms by relating them to the powerful techniques of stochastic approximation theory via a new convergence theorem. The theorem establishes a general class of convergent algorithms to which both TD(lambda) and Q-learning belong.…”
    Get full text
  12. Bounds on Urysohn width by Balitskiy, Alexey
    Published 2022
    “…This notion was introduced in the context of dimension theory, used in approximation theory, appeared in the work of Gromov on systolic geometry, and nowadays it is a metric invariant of independent interest. …”
    Get full text
    Thesis
  13. An Overview of Some Issues in the Theory of Deep Networks by Poggio, Tomaso, Banburski, Andrzej
    Published 2022
    “…We review our contributions in the areas of approximation theory and optimization. We also introduce a new approach based on cross-validation leave-one-out stability to estimate bounds on the expected error of overparametrized classifiers, such as deep networks. © 2020 Institute of Electrical Engineers of Japan. …”
    Get full text
    Article
  14. Hierarchically Local Tasks and Deep Convolutional Networks by Deza, Arturo, Liao, Qianli, Banburski, Andrzej, Poggio, Tomaso
    Published 2020
    “…Recent results in approximation theory have shown that there is an exponential advantage of deep convolutional-like networks in approximating functions with hierarchical locality in their compositional structure. …”
    Get full text
    Technical Report
  15. How Many Subpopulations Is Too Many? Exponential Lower Bounds for Inferring Population Histories by Kim, Younhun, Koehler, Frederic, Moitra, Ankur, Mossel, Elchanan, Ramnarayan, Govind
    Published 2021
    “…Using a variety of tools from information theory, the theory of extremal polynomials, and approximation theory, we prove new sharp information-theoretic lower bounds on the problem of reconstructing population structure - the history of multiple subpopulations that merge, split, and change sizes over time. …”
    Get full text
    Article
  16. How Many Subpopulations Is Too Many? Exponential Lower Bounds for Inferring Population Histories by Kim, Younhun, Koehler, Frederic, Moitra, Ankur, Mossel, Elchanan, Ramnarayan, Govind
    Published 2022
    “…Using a variety of tools from information theory, the theory of extremal polynomials, and approximation theory, we prove new sharp information-theoretic lower bounds on the problem of reconstructing population structure - the history of multiple subpopulations that merge, split, and change sizes over time. …”
    Get full text
    Article
  17. How Many Subpopulations Is Too Many?: Exponential Lower Bounds for Inferring Population Histories by Kim, Younhun, Koehler, Frederic, Moitra, Ankur, Mossel, Elchanan, Ramnarayan, Govind
    Published 2020
    “…Using a variety of tools from information theory, the theory of extremal polynomials, and approximation theory, we prove new sharp information-theoretic lower bounds on the problem of reconstructing population structure—the history of multiple subpopulations that merge, split and change sizes over time. …”
    Get full text
    Book
  18. Theoretical issues in deep networks by Poggio, Tomaso, Banburski, Andrzej, Liao, Qianli
    Published 2021
    “…We review our recent results toward this goal. In approximation theory both shallow and deep networks are known to approximate any continuous functions at an exponential cost. …”
    Get full text
    Article
  19. Theoretical Issues in Deep Networks by Poggio, Tomaso, Banburski, Andrzej, Liao, Qianli
    Published 2019
    “…We review our recent results towards this goal. In approximation theory both shallow and deep networks are known to approximate any continuous functions on a bounded domain at a cost which is exponential (the number of parameters is exponential in the dimensionality of the function). …”
    Get full text
    Technical Report
  20. Accelerating Asymptotically Exact MCMC for Computationally Intensive Models via Local Approximations by Conrad, Patrick R., Marzouk, Youssef M., Pillai, Natesh S., Smith, Aaron
    Published 2015
    “…Our approach introduces local approximations of these models into the Metropolis-Hastings kernel, borrowing ideas from deterministic approximation theory, optimization, and experimental design. …”
    Get full text
    Article