Dop-DenseNet: Densely Convolutional Neural Network-Based Gesture Recognition Using a Micro-Doppler Radar

Bibliographic Details
Main Authors: Hai Le, Van-Phuc Hoang, Van Sang Doan, Dai Phong Le
Format: Article
Language: English
Published: The Korean Institute of Electromagnetic Engineering and Science 2022-05-01
Series: Journal of Electromagnetic Engineering and Science
Online Access: http://jees.kr/upload/pdf/jees-2022-3-r-95.pdf
Description
Summary: Hand gesture recognition is an efficient and practical solution for non-contact human–machine interaction in smart devices. To date, vision-based methods have been widely used in this research area, but they are sensitive to lighting conditions. To address this issue, radar-based gesture recognition using micro-Doppler signatures can be applied as an alternative. Accordingly, a novel densely connected convolutional neural network model, Dop-DenseNet, is proposed in this paper to improve hand gesture recognition in terms of classification accuracy and latency. The model is designed with cross (skip) connections in a dense architecture so that earlier features, which can otherwise be lost during forward propagation, are reused by later layers. We evaluated the model with different numbers of filter channels and experimented on the Dop-Net dataset with different time lengths of input data. The model with 64 filters of size 3 × 3 and 200 time bins of micro-Doppler spectrogram data achieved the best performance trade-off: 99.87% classification accuracy at 3.1 ms latency. On the same Dop-Net dataset, the model markedly outperformed selected state-of-the-art neural networks (GoogLeNet, ResNet-50, NASNet-Mobile, and MobileNet-V2).
ISSN: 2671-7255
2671-7263
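
Note: This record gives only the abstract, so the exact Dop-DenseNet topology is not available here. As a rough illustration of the dense (skip) connections the summary describes, below is a minimal, hypothetical PyTorch sketch of a dense block in which every layer receives the channel-wise concatenation of all earlier feature maps. The layer count, single input channel, and 128 Doppler bins are illustrative assumptions; only the 64 filters of size 3 × 3 and the 200 time bins follow the abstract.

    import torch
    import torch.nn as nn

    class DenseBlock(nn.Module):
        """Illustrative dense block (not the authors' implementation):
        each layer consumes the concatenation of the block input and all
        earlier layer outputs, so early features are reused downstream."""

        def __init__(self, in_channels: int, growth: int = 64, num_layers: int = 3):
            super().__init__()
            self.layers = nn.ModuleList()
            channels = in_channels
            for _ in range(num_layers):
                self.layers.append(nn.Sequential(
                    # 64 filters of size 3x3 per layer, matching the abstract's best setup
                    nn.Conv2d(channels, growth, kernel_size=3, padding=1),
                    nn.BatchNorm2d(growth),
                    nn.ReLU(inplace=True),
                ))
                channels += growth  # the next layer sees all previous feature maps

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            feats = [x]
            for layer in self.layers:
                feats.append(layer(torch.cat(feats, dim=1)))
            return torch.cat(feats, dim=1)

    # Hypothetical input: one micro-Doppler spectrogram with 200 time bins
    # (the best input length reported); 128 Doppler bins is an assumption.
    x = torch.randn(1, 1, 128, 200)   # (batch, channels, Doppler bins, time bins)
    y = DenseBlock(in_channels=1)(x)
    print(y.shape)                     # torch.Size([1, 193, 128, 200])

Because each layer's output is concatenated rather than summed, earlier feature maps remain directly accessible to every later layer, which is the feature-reuse property the abstract credits for the model's accuracy–latency trade-off.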