Data glove-based gesture recognition using CNN-BiLSTM model with attention mechanism.

As a novel form of human-machine interaction (HMI), hand gesture recognition (HGR) has attracted extensive attention and research. Most HGR studies are based on vision systems and inevitably encounter challenges such as depth and occlusion. In contrast, data gloves can collect data with minimal interference in complex environments and have therefore become a research focus in fields such as medical simulation and virtual reality. To explore the application of data gloves to dynamic gesture recognition, this paper proposes a data glove-based dynamic gesture recognition model, the Attention-based CNN-BiLSTM Network (A-CBLN). In A-CBLN, a convolutional neural network (CNN) captures local features, while a bidirectional long short-term memory (BiLSTM) network extracts contextual temporal features from the gesture data. An attention mechanism assigns weights to the gesture features, helping the model distinguish the meanings of different gestures and thereby improving recognition accuracy. Seven dynamic gestures were selected as research targets and 32 subjects were recruited. Experimental results demonstrate that A-CBLN effectively addresses dynamic gesture recognition, outperforming existing models and achieving the best recognition performance, with an accuracy of 95.05% and a precision of 95.43% on the test dataset.
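
The abstract describes a three-stage architecture: convolutions for local feature extraction, a BiLSTM for bidirectional temporal context, and an attention layer that weights the resulting features (here read as temporal attention over time steps) before classification. Below is a minimal sketch of such a model in PyTorch; the sensor count, layer widths, kernel sizes, and sequence length are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of an attention-based CNN-BiLSTM classifier in the spirit of A-CBLN.
# Assumes PyTorch; all dimensions below are hypothetical, not the paper's settings.
import torch
import torch.nn as nn


class ACBLNSketch(nn.Module):
    def __init__(self, n_sensors=15, n_classes=7, conv_channels=64, lstm_hidden=128):
        super().__init__()
        # CNN: 1-D convolutions over time capture local patterns in the glove signals.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_sensors, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(conv_channels, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # BiLSTM: models forward and backward temporal context of the gesture.
        self.bilstm = nn.LSTM(conv_channels, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Attention: a learned score per time step, softmax-normalized into weights.
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        self.classifier = nn.Linear(2 * lstm_hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, n_sensors) glove readings
        h = self.cnn(x.transpose(1, 2)).transpose(1, 2)  # (batch, time, conv_channels)
        h, _ = self.bilstm(h)                            # (batch, time, 2*lstm_hidden)
        w = torch.softmax(self.attn(h), dim=1)           # (batch, time, 1) attention weights
        context = (w * h).sum(dim=1)                     # weighted sum over time
        return self.classifier(context)                  # (batch, n_classes) logits


# Example: a batch of 8 sequences, 100 time steps, 15 sensor channels, 7 gesture classes.
logits = ACBLNSketch()(torch.randn(8, 100, 15))
print(logits.shape)  # torch.Size([8, 7])
```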

Bibliographic Details
Main Authors: Jiawei Wu, Peng Ren, Boming Song, Ran Zhang, Chen Zhao, Xiao Zhang
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2023-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0294174
Collection: DOAJ (Directory of Open Access Journals)
Record ID: doaj.art-db26db8483734712baf73436c74409fa
ISSN: 1932-6203
Citation: PLoS ONE 18(11): e0294174