Recurrent online kernel recursive least square algorithm for nonlinear modeling

In this paper, we propose a recurrent kernel recursive least square (RLS) algorithm for online learning. In classical kernel methods, the number of kernel functions grows as the number of training samples increases, which makes the computational cost of the algorithm very high and applicable only to offline learning...

Full description

Bibliographic Details
Main Authors: Fan, Haijin, Song, Qing, Xu, Zhao
Other Authors: School of Electrical and Electronic Engineering
Format: Conference Paper
Language:English
Published: 2013
Subjects:
Online Access:https://hdl.handle.net/10356/101013
http://hdl.handle.net/10220/16315
author Fan, Haijin
Song, Qing
Xu, Zhao
author2 School of Electrical and Electronic Engineering
author_facet School of Electrical and Electronic Engineering
Fan, Haijin
Song, Qing
Xu, Zhao
author_sort Fan, Haijin
collection NTU
description In this paper, we propose a recurrent kernel recursive least square (RLS) algorithm for online learning. In classical kernel methods, the number of kernel functions grows as the number of training samples increases, which makes the computational cost of the algorithm very high and applicable only to offline learning. To make kernel methods suitable for online learning, where the system is updated whenever a new training sample is obtained, a compact dictionary (set of support vectors) should be chosen to represent the whole training data, which in turn reduces the number of kernel functions. For this purpose, a sparsification method based on the Hessian matrix of the loss function is applied to continuously examine the importance of each new training sample and to determine, according to this importance measure, whether the dictionary should be updated. We show that the Hessian matrix is equivalent to the correlation matrix of the training samples in the RLS algorithm, so the sparsification method can be easily incorporated into the RLS algorithm and reduces the computational cost further. Simulation results show that our algorithm is an effective learning method for online chaotic signal prediction and nonlinear system identification.
first_indexed 2024-10-01T07:25:00Z
format Conference Paper
id ntu-10356/101013
institution Nanyang Technological University
language English
last_indexed 2024-10-01T07:25:00Z
publishDate 2013
record_format dspace
spelling ntu-10356/1010132020-03-07T13:24:50Z Recurrent online kernel recursive least square algorithm for nonlinear modeling Fan, Haijin Song, Qing Xu, Zhao School of Electrical and Electronic Engineering Annual Conference on IEEE Industrial Electronics Society (38th : 2012 : Montreal, Canada) DRNTU::Engineering::Electrical and electronic engineering 2013-10-10T01:28:56Z 2019-12-06T20:31:59Z 2013-10-10T01:28:56Z 2019-12-06T20:31:59Z 2012 2012 Conference Paper Fan, H., Song, Q., & Xu, Z. (2012). Recurrent online kernel recursive least square algorithm for nonlinear modeling. IECON 2012 - 38th Annual Conference on IEEE Industrial Electronics Society, pp. 1574-1579.
https://hdl.handle.net/10356/101013 http://hdl.handle.net/10220/16315 10.1109/IECON.2012.6388534 en
spellingShingle DRNTU::Engineering::Electrical and electronic engineering
Fan, Haijin
Song, Qing
Xu, Zhao
Recurrent online kernel recursive least square algorithm for nonlinear modeling
title Recurrent online kernel recursive least square algorithm for nonlinear modeling
title_full Recurrent online kernel recursive least square algorithm for nonlinear modeling
title_fullStr Recurrent online kernel recursive least square algorithm for nonlinear modeling
title_full_unstemmed Recurrent online kernel recursive least square algorithm for nonlinear modeling
title_short Recurrent online kernel recursive least square algorithm for nonlinear modeling
title_sort recurrent online kernel recursive least square algorithm for nonlinear modeling
topic DRNTU::Engineering::Electrical and electronic engineering
url https://hdl.handle.net/10356/101013
http://hdl.handle.net/10220/16315
work_keys_str_mv AT fanhaijin recurrentonlinekernelrecursiveleastsquarealgorithmfornonlinearmodeling
AT songqing recurrentonlinekernelrecursiveleastsquarealgorithmfornonlinearmodeling
AT xuzhao recurrentonlinekernelrecursiveleastsquarealgorithmfornonlinearmodeling