Universal approximation property of stochastic configuration networks for time series
Main Authors: | Jin-Xi Zhang, Hangyi Zhao, Xuefeng Zhang |
---|---|
Affiliations: | Jin-Xi Zhang: State Key Laboratory of Synthetical Automation for Process Industries, Northeastern University; Hangyi Zhao, Xuefeng Zhang: College of Sciences, Northeastern University |
Format: | Article |
Language: | English |
Published: | Springer, 2024-03-01 |
Series: | Industrial Artificial Intelligence |
ISSN: | 2731-667X |
Subjects: | Recurrent neural networks; Stochastic configuration networks; Incremental learning; Time series; Deep learning |
Online Access: | https://doi.org/10.1007/s44244-024-00017-7 |
Description: | Abstract: To process sequential data such as time series, and to address the difficulty of manually tuning the architecture of traditional recurrent neural networks (RNNs), this paper introduces the Recurrent Stochastic Configuration Network (RSCN), built on the random incremental algorithm of stochastic configuration networks. Retaining the basic structure of a recurrent neural network, the learning model starts from a small RNN with a single hidden layer containing one hidden node. Hidden nodes are then added incrementally: each new node's parameters are generated by a random configuration process, and the corresponding weights are assigned constructively. This expansion continues until the network satisfies predefined termination criteria. The resulting algorithm adapts well to time series data and outperforms traditional recurrent neural networks of similar architecture. The experimental results underscore the efficacy of the proposed RSCN for sequence data processing, showing its advantages over conventional recurrent neural networks in the experiments performed. |
Citation: | Industrial Artificial Intelligence, vol. 2, no. 1, pp. 1-17 (2024) |
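The abstract above describes the constructive procedure only at a high level. The following is a minimal, hypothetical Python sketch of one way such an incremental recurrent construction could look; it is not the paper's algorithm. In particular, the stochastic configuration acceptance inequality is replaced here by a simpler best-of-candidates rule, the tanh activation, uniform sampling range, and least-squares readout are assumptions, and the function names (`build_rscn`, `rnn_states`) are invented for illustration.

```python
# Minimal sketch of an SCN-style incremental recurrent network. Assumptions:
# tanh activation, a best-of-candidates selection rule in place of the
# paper's stochastic configuration inequality, and a least-squares readout.
# All names here are illustrative, not from the paper.
import numpy as np


def rnn_states(W_in, W_rec, b, u):
    """Run a single-hidden-layer tanh RNN over a sequence u (T x d_in);
    return the hidden states as a T x L matrix."""
    h = np.zeros(b.shape[0])
    states = []
    for u_t in u:
        h = np.tanh(W_in @ u_t + W_rec @ h + b)
        states.append(h)
    return np.array(states)


def build_rscn(u, y, max_nodes=50, tol=1e-3, n_candidates=20, scale=1.0, seed=0):
    """Grow hidden nodes one at a time. Each step samples several random
    candidate nodes, keeps the one whose augmented network yields the
    smallest residual after refitting the output weights by least squares,
    and stops once a predefined error tolerance is met."""
    rng = np.random.default_rng(seed)
    T, d_in = u.shape
    W_in = np.empty((0, d_in))   # input weights, one row per hidden node
    W_rec = np.empty((0, 0))     # recurrent weights among hidden nodes
    b = np.empty(0)              # hidden biases
    beta = None                  # output (readout) weights
    for _ in range(max_nodes):
        best = None
        for _ in range(n_candidates):
            # Randomly configure one candidate node's parameters.
            Wi = np.vstack([W_in, rng.uniform(-scale, scale, d_in)])
            L = W_rec.shape[0] + 1
            Wr = np.zeros((L, L))
            Wr[:-1, :-1] = W_rec                       # keep existing links
            Wr[-1, :] = rng.uniform(-scale, scale, L)  # new node reads all nodes
            bb = np.append(b, rng.uniform(-scale, scale))
            H = rnn_states(Wi, Wr, bb, u)
            cand_beta, *_ = np.linalg.lstsq(H, y, rcond=None)
            err = np.linalg.norm(y - H @ cand_beta)
            if best is None or err < best[0]:
                best = (err, Wi, Wr, bb, cand_beta)
        err, W_in, W_rec, b, beta = best
        if err < tol:            # predefined termination criterion
            break
    return W_in, W_rec, b, beta
```

One design note on this sketch: because an accepted node's incoming weights are frozen and new nodes do not feed back into old ones, the existing nodes' state trajectories never change, so they could be cached and only the candidate node's states simulated at each step, rather than recomputing the full state matrix as done above for simplicity.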