Universal approximation property of stochastic configuration networks for time series

Detailed Description

Bibliographic Details
Main Authors: Jin-Xi Zhang, Hangyi Zhao, Xuefeng Zhang
Format: Article
Language: English
Published: Springer, 2024-03-01
Series: Industrial Artificial Intelligence
Online Access: https://doi.org/10.1007/s44244-024-00017-7
Additional Bibliographic Details
Summary: For the purpose of processing sequential data, such as time series, and addressing the challenge of manually tuning the architecture of traditional recurrent neural networks (RNNs), this paper introduces a novel approach: the Recurrent Stochastic Configuration Network (RSCN). The network is constructed with the random incremental algorithm of stochastic configuration networks. Building on the basic structure of recurrent neural networks, the learning model starts from a small recurrent network with a single hidden layer containing one hidden node. Hidden nodes are then added incrementally: the parameters of each new node are generated by a random configuration process, and the corresponding weights are assigned structurally. This iterative expansion continues until the network satisfies predefined termination criteria. Notably, the algorithm adapts to time series data and outperforms traditional recurrent neural networks of similar architecture. The experimental results presented in the paper demonstrate the efficacy of the proposed RSCN for sequence data processing and its advantages over conventional recurrent neural networks in the experiments performed.
ISSN: 2731-667X
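
The abstract above describes an incremental construction: start from one hidden node, randomly configure each new node's parameters, assign output weights, and stop at a termination criterion. The Python sketch below illustrates that general recipe under assumed simplifications; it is not the authors' implementation. In particular, the supervisory inequality of stochastic configuration networks is replaced by a simple residual-correlation score, recurrent connections feed only from previously added nodes into the new node, and all names (rscn_fit, n_candidates, scale) are illustrative.

import numpy as np

def rscn_fit(X, Y, max_nodes=50, tol=1e-2, n_candidates=20, scale=1.0):
    """Grow a single-hidden-layer recurrent network node by node (sketch).

    X: (T, d) input sequence; Y: (T, m) target sequence.
    Each new node gets randomly drawn input/recurrent weights and a bias;
    among several candidates, the one whose output correlates best with
    the current residual is kept (a stand-in for the SCN supervisory
    mechanism). Output weights are then refit by least squares.
    """
    T, d = X.shape
    H = np.zeros((T, 0))        # hidden outputs, one column per node
    params = []                 # (input weights, recurrent weights, bias) per node
    residual = Y.copy()
    beta = np.zeros((0, Y.shape[1]))

    for k in range(max_nodes):
        best_score, best = -np.inf, None
        for _ in range(n_candidates):
            w = np.random.uniform(-scale, scale, d)      # input weights
            u = np.random.uniform(-scale, scale, k + 1)  # recurrent weights (existing nodes + self)
            bias = np.random.uniform(-scale, scale)
            # Run the candidate node over the whole sequence.
            h = np.zeros(T)
            prev = np.zeros(k + 1)  # previous hidden state seen by this node
            for t in range(T):
                h[t] = np.tanh(X[t] @ w + prev @ u + bias)
                prev = np.append(H[t], h[t])
            # Residual-correlation score (assumed surrogate for the
            # configuration inequality of stochastic configuration networks).
            score = np.sum((residual.T @ h) ** 2) / (h @ h + 1e-12)
            if score > best_score:
                best_score, best = score, (w, u, bias, h)
        w, u, bias, h = best
        params.append((w, u, bias))
        H = np.column_stack([H, h])
        # Output weights assigned constructively via least squares.
        beta, *_ = np.linalg.lstsq(H, Y, rcond=None)
        residual = Y - H @ beta
        if np.linalg.norm(residual) < tol:  # predefined termination criterion
            break
    return params, beta

For a one-step-ahead prediction task, one might call rscn_fit with X = series[:-1].reshape(-1, 1) and Y = series[1:].reshape(-1, 1). The key design point mirrored from the abstract is that only the newest node's parameters are randomized at each step, while the output weights for all nodes are recomputed, so earlier nodes never need retraining.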