Optimizing Echo State Networks for Enhancing Large Prediction Horizons of Chaotic Time Series

Bibliographic Details
Main Authors: Astrid Maritza González-Zapata, Esteban Tlelo-Cuautle, Brisbane Ovilla-Martinez, Israel Cruz-Vega, Luis Gerardo De la Fraga
Format: Article
Language: English
Published: MDPI AG 2022-10-01
Series: Mathematics
Online Access: https://www.mdpi.com/2227-7390/10/20/3886
Description
Summary: Reservoir computing has shown promising results in predicting chaotic time series. However, the main challenges of time-series prediction are reducing computational cost and increasing the prediction horizon. We therefore propose optimizing Echo State Networks (ESN), with the main goal of increasing the prediction horizon while using fewer reservoir neurons than state-of-the-art models. In addition, we show that applying a decimation technique lets us emulate an increase in the prediction horizon of up to 10,000 steps ahead. The optimization is performed with particle swarm optimization, considering two chaotic systems as case studies: the chaotic Hindmarsh–Rose neuron with slow dynamic behavior and the well-known Lorenz system. The results show that, whereas similar works used 200 to 5000 reservoir neurons to predict 120 to 700 steps ahead, our optimized ESN with decimation used 100 reservoir neurons and predicted up to 10,000 steps ahead. The main conclusion is that we achieved prediction horizons more than one order of magnitude larger than recent works while greatly reducing computational cost.
ISSN: 2227-7390
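The ESN-with-decimation idea described in the summary can be illustrated with a minimal sketch. This is not the authors' code: the reservoir size (100 neurons) follows the abstract, but the logistic map stands in for the Lorenz/Hindmarsh–Rose series, the hyperparameters are illustrative, and the particle swarm optimization step the paper uses to tune them is omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

N = 100          # reservoir neurons (the paper's optimized count)
RHO = 0.9        # spectral radius, scaled below 1 to encourage the echo state property
DECIMATION = 10  # keep every 10th sample; predicting one decimated step
                 # then corresponds to 10 original steps ahead

# Illustrative chaotic driver: the logistic map stands in for the
# Lorenz / Hindmarsh-Rose systems used in the paper.
def logistic_series(n, x0=0.4, r=3.9):
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1 - x)
        xs[i] = x
    return xs

series = logistic_series(20000)[::DECIMATION]  # decimated training signal

# Random input and recurrent weights; rescale W to the target spectral radius.
W_in = rng.random((N, 1)) - 0.5
W = rng.random((N, N)) - 0.5
W *= RHO / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir with the signal and collect its states.
T = len(series) - 1
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W_in[:, 0] * series[t] + W @ x)
    X[t] = x

# Linear readout trained by ridge regression to predict the next
# (decimated) sample from the current reservoir state.
y = series[1:]
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(N), X.T @ y)

pred = X @ W_out
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"one-step RMSE on the decimated series: {rmse:.4f}")
```

Only the readout weights `W_out` are trained; the random reservoir stays fixed, which is what keeps ESN training cheap. In the paper, parameters such as the spectral radius fixed by hand here are instead tuned with particle swarm optimization.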