Self-consistent learning of neural dynamical systems from noisy time series

We introduce a new method which, for a single noisy time series, provides unsupervised filtering, state space reconstruction, efficient learning of the unknown governing multivariate dynamical system, and deterministic forecasting. We construct both the underlying trajectories and a latent dynamical system using deep neural networks. Under the assumption that the trajectories follow the latent dynamical system, we determine the unknowns of the dynamical system and filter out stochastic outliers in the measurements; in this sense the method is self-consistent. The embedding dimension is determined iteratively during training with the false-nearest-neighbors algorithm and is implemented as an attention map on the state vector, which allows state space reconstruction without a priori information on the signal. By exploiting the differentiability of the neural solution trajectory, we define the neural dynamical system locally at each time, avoiding the forward and backward passes through numerical solvers required by the canonical adjoint method. On a chaotic time series masked by additive Gaussian noise, we demonstrate that the denoising ability and the predictive power of the proposed method are due mainly to the self-consistency and are insensitive to the method used for state space reconstruction.
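
The method as described combines a neural trajectory fitted to the noisy measurements with a jointly learned latent vector field, and obtains the time derivative by differentiating the trajectory network directly rather than by running an ODE solver with the adjoint method. The following is a minimal PyTorch sketch of that self-consistency objective only; the network sizes, the toy sine observations, and the equal weighting of the two loss terms are illustrative assumptions, and the embedding-dimension selection via the false-nearest-neighbors attention map is omitted.

```python
# Minimal sketch (illustrative, not the authors' code): jointly fit a neural
# trajectory x_theta(t) and a neural vector field f_phi(x) so that
# dx_theta/dt ~= f_phi(x_theta(t)) while x_theta(t_i) stays close to the noisy
# observations y_i (the self-consistency described in the abstract).
import torch
import torch.nn as nn


class MLP(nn.Module):
    def __init__(self, d_in, d_out, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_in, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, d_out),
        )

    def forward(self, x):
        return self.net(x)


d = 3                       # assumed embedding (state-space) dimension
traj = MLP(1, d)            # neural trajectory   x_theta : t -> R^d
field = MLP(d, d)           # neural vector field f_phi   : R^d -> R^d

# Toy noisy scalar measurements standing in for the observed time series.
t_obs = torch.linspace(0.0, 10.0, 200).unsqueeze(1)
y_obs = torch.sin(t_obs) + 0.1 * torch.randn_like(t_obs)

opt = torch.optim.Adam(list(traj.parameters()) + list(field.parameters()), lr=1e-3)

for step in range(2000):
    t = t_obs.clone().requires_grad_(True)
    x = traj(t)                                   # reconstructed state trajectory

    # Time derivative of each state component by differentiating the trajectory
    # network itself (no forward/backward passes through a numerical ODE solver).
    dxdt = torch.stack(
        [torch.autograd.grad(x[:, i].sum(), t, create_graph=True)[0].squeeze(1)
         for i in range(d)],
        dim=1,
    )

    data_loss = ((x[:, :1] - y_obs) ** 2).mean()        # trajectory matches noisy data
    consistency_loss = ((dxdt - field(x)) ** 2).mean()  # trajectory obeys the latent ODE

    loss = data_loss + consistency_loss                 # equal weighting: an assumption
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Once trained, forecasts would presumably come from integrating the learned field f_phi forward from the final reconstructed state; that step is not shown here.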

Bibliographic Details
Main Authors: Wang, Zhe; Guet, Claude
Other Authors: School of Physical and Mathematical Sciences; Energy Research Institute @ NTU (ERI@N)
Format: Journal Article
Language: English
Published: 2022
Citation: Wang, Z. & Guet, C. (2022). Self-consistent learning of neural dynamical systems from noisy time series. IEEE Transactions on Emerging Topics in Computational Intelligence, 6(5), 1103-1112.
ISSN: 2471-285X
DOI: 10.1109/TETCI.2022.3146332
Scopus ID: 2-s2.0-85124820022
Subjects: Engineering::Computer science and engineering; Time Series Analysis; Dynamical Systems
Online Access: https://hdl.handle.net/10356/162829
Funding: The work of Zhe Wang was supported by the Energy Research Institute @ NTU, Nanyang Technological University, where most of this work was performed. This work was supported by the National Research Foundation, Prime Minister's Office, Singapore, under its Campus for Research Excellence and Technological Enterprise (CREATE) programme.
Rights: © 2022 IEEE. All rights reserved.