Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding


Bibliographic Details
Main Authors: Zhibin Yu, Dennis S. Moirangthem, Minho Lee
Format: Article
Language: English
Published: Frontiers Media S.A., 2017-08-01
Series: Frontiers in Neurorobotics
Subjects: continuous timescale; recurrent neural network; LSTM; classification; dynamic sequence
Online Access: http://journal.frontiersin.org/article/10.3389/fnbot.2017.00042/full
author Zhibin Yu
Dennis S. Moirangthem
Minho Lee
collection DOAJ
description Understanding human intention by observing a series of human actions is a challenging task. To do so, we need to analyze longer sequences of human actions related to intentions and extract the context from the dynamic features. The multiple timescales recurrent neural network (MTRNN) model, which has been proposed as a solution, is a useful tool for recording and regenerating a continuous signal for dynamic tasks. However, the conventional MTRNN suffers from the vanishing gradient problem, which makes it unsuitable for understanding longer sequences. To address this problem, we propose a new model named Continuous Timescale Long-Short Term Memory (CTLSTM), in which we incorporate the multiple timescales concept into the Long-Short Term Memory (LSTM) recurrent neural network (RNN), which addresses the vanishing gradient problem. We design an additional recurrent connection on the LSTM cell outputs that produces a time delay in order to capture the slow context. Our experiments show that the proposed model exhibits better context modeling ability and captures the dynamic features on multiple large-dataset classification tasks. The results illustrate that the multiple timescales concept enhances the ability of our model to handle longer sequences related to human intentions, making it more suitable for complex tasks such as intention recognition.
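To make the architecture described above concrete, here is a minimal sketch of a single CTLSTM step in Python/NumPy. It assumes the timescale enters as a leaky integration on the LSTM cell output, following the MTRNN convention; the placement of the time constant, the function name, and all variable names are illustrative assumptions, not the authors' reference implementation.

```python
# Hypothetical sketch of one CTLSTM step: a standard LSTM cell whose output
# is additionally passed through a leaky (continuous-timescale) recurrent
# connection. The tau placement follows the MTRNN convention (an assumption).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ctlstm_step(x, h_prev, c_prev, W, U, b, tau):
    """One step of a continuous-timescale LSTM cell (illustrative).

    x:      input vector, shape (n_in,)
    h_prev: previous (leaky) output, shape (n_hid,)
    c_prev: previous cell state, shape (n_hid,)
    W:      input weights, shape (4*n_hid, n_in)
    U:      recurrent weights, shape (4*n_hid, n_hid)
    b:      biases, shape (4*n_hid,)
    tau:    timescale, tau >= 1; tau == 1 recovers a plain LSTM step
    """
    z = W @ x + U @ h_prev + b
    i, f, o, g = np.split(z, 4)                         # gate pre-activations
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)   # standard cell update
    h_fast = sigmoid(o) * np.tanh(c)                    # ordinary LSTM output
    # Extra recurrent connection on the cell output: leaky integration with
    # timescale tau delays the output so that large tau captures slow context.
    h = (1.0 - 1.0 / tau) * h_prev + (1.0 / tau) * h_fast
    return h, c
```

Stacking two such layers with different timescales (for instance, tau = 1 for a fast input layer and a larger tau for a slow context layer) would mirror the fast/slow arrangement of the MTRNN; the specific values here are illustrative.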
format Article
id doaj.art-11f0e79b851044eeb2907a2a0b277a47
institution Directory Open Access Journal
issn 1662-5218
language English
publishDate 2017-08-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Neurorobotics
affiliation Zhibin Yu: Department of Electrical Engineering, College of Information Science and Engineering, Ocean University of China, Qingdao, China
Dennis S. Moirangthem: School of Electronics Engineering, Kyungpook National University, Daegu, South Korea
Minho Lee: School of Electronics Engineering, Kyungpook National University, Daegu, South Korea
title Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding
topic continuous timescale
recurrent neural network
LSTM
classification
dynamic sequence
url http://journal.frontiersin.org/article/10.3389/fnbot.2017.00042/full