On neural differential equations

<p>The conjoining of dynamical systems and deep learning has become a topic of great interest. In particular, neural differential equations (NDEs) demonstrate that neural networks and differential equations are two sides of the same coin. Traditional parameterised differential equations are a special case. Many popular neural network architectures, such as residual networks and recurrent networks, are discretisations of differential equations.</p>

<p>NDEs are suitable for tackling generative problems, dynamical systems, and time series (particularly in physics, finance, ...) and are thus of interest to both modern machine learning and traditional mathematical modelling. NDEs offer high-capacity function approximation, strong priors on model space, the ability to handle irregular data, memory efficiency, and a wealth of available theory on both sides.</p>

<p>This doctoral thesis provides an in-depth survey of the field.</p>

<p>Topics include: neural ordinary differential equations (e.g. for hybrid neural/mechanistic modelling of physical systems); neural controlled differential equations (e.g. for learning functions of irregular time series); and neural stochastic differential equations (e.g. to produce generative models capable of representing complex stochastic dynamics, or sampling from complex high-dimensional distributions).</p>

<p>Further topics include: numerical methods for NDEs (e.g. reversible differential equation solvers, backpropagation through differential equations, Brownian reconstruction); symbolic regression for dynamical systems (e.g. via regularised evolution); and deep implicit models (e.g. deep equilibrium models, differentiable optimisation).</p>

<p>We anticipate this thesis will be of interest to anyone interested in the marriage of deep learning with dynamical systems, and hope it will provide a useful reference for the current state of the art.</p>
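The abstract's claim that residual networks are discretisations of differential equations can be made concrete: an explicit Euler step of dy/dt = f(y) with step size dt = 1 is exactly a residual block y_{n+1} = y_n + f(y_n). The following is a minimal numpy sketch of that correspondence; the two-layer vector field `f` and its randomly drawn weights are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

# Hypothetical learned vector field f(y): a tiny two-layer MLP with fixed
# illustrative weights. In a neural ODE one would train these weights.
rng = np.random.default_rng(0)
W1 = 0.1 * rng.normal(size=(4, 2))
W2 = 0.1 * rng.normal(size=(2, 4))

def f(y):
    return W2 @ np.tanh(W1 @ y)

def euler_neural_ode(y0, n_steps, dt):
    # Explicit Euler discretisation of dy/dt = f(y):
    #   y_{n+1} = y_n + dt * f(y_n)
    # With dt = 1 each step is precisely a residual block: y + f(y).
    y = y0
    for _ in range(n_steps):
        y = y + dt * f(y)
    return y

y0 = np.array([1.0, -1.0])
yT = euler_neural_ode(y0, n_steps=10, dt=0.1)  # depth-10 "ResNet" forward pass
```

Shrinking dt while increasing the step count recovers the continuous-time ODE limit, which is the sense in which residual networks are a special case of neural ODEs.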


Bibliographic details
Main author: Kidger, P
Other authors: Lyons, T
Format: Thesis
Language: English
Published: 2021
Institution: University of Oxford
Topics: Machine learning; Deep learning (Machine learning); Stochastic differential equations; Differential equations