Recognition of Upper Limb Action Intention Based on IMU
Using motion information of the upper limb to control a prosthetic hand has become a focus of current research. The operation of the prosthetic hand must be coordinated with the user's intention; therefore, identifying the action intention of the upper limb from its motion information is key to controlling the prosthetic hand...
Main Authors: | Jian-Wei Cui, Zhi-Gang Li, Han Du, Bing-Yan Yan, Pu-Dong Lu |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-03-01 |
Series: | Sensors |
Subjects: | action intention recognition of upper limb; prosthetic hand control; inertial sensor; dividing motion unit; 10-fold cross-validation |
Online Access: | https://www.mdpi.com/1424-8220/22/5/1954 |
_version_ | 1797473699334455296 |
---|---|
author | Jian-Wei Cui; Zhi-Gang Li; Han Du; Bing-Yan Yan; Pu-Dong Lu
author_facet | Jian-Wei Cui; Zhi-Gang Li; Han Du; Bing-Yan Yan; Pu-Dong Lu
author_sort | Jian-Wei Cui |
collection | DOAJ |
description | Using motion information of the upper limb to control a prosthetic hand has become a focus of current research. The operation of the prosthetic hand must be coordinated with the user's intention; therefore, identifying the action intention of the upper limb from its motion information is key to controlling the prosthetic hand. Because wearable inertial sensors are small, inexpensive, and largely insensitive to interference from the external environment, we use an inertial sensor to collect angle and angular-velocity data during upper-limb movement. To classify the actions of putting on socks, putting on shoes, and tying shoelaces, this paper proposes a recognition model based on a Dynamic Time Warping (DTW) algorithm applied to motion units. The complete motion data are divided into several motion units according to whether the upper limb is moving. To limit the delay in controlling the prosthetic hand, features are extracted only from the first and second motion units, and the action is recognized with different classifiers. The experimental results show that the motion-unit-based DTW algorithm achieves a higher recognition rate and a lower running time than the other classifiers evaluated: the recognition rate reaches 99.46%, and the average running time is 8.027 ms. To enable the prosthetic hand to understand the grasping intention of the upper limb, this paper also proposes a Generalized Regression Neural Network (GRNN) model based on 10-fold cross-validation. The motion state of the upper limb is subdivided, and the static state is used as the trigger signal for controlling the prosthetic hand. Ten-fold cross-validation is applied when training the neural network to find the optimal smoothing parameter, and the recognition performance of different neural networks is compared. The experimental results show that the GRNN model based on 10-fold cross-validation achieves a high accuracy of 98.28%. Finally, the two proposed algorithms are applied in an experiment in which the prosthetic hand reproduces an action, and their feasibility and practicability are verified experimentally.
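The abstract outlines two algorithmic building blocks: segmenting the IMU stream into motion units and classifying the first two units with DTW, and a GRNN whose smoothing parameter is chosen by 10-fold cross-validation. The sketches below illustrate both in plain NumPy. They are minimal illustrations, not the paper's implementation: the movement threshold, minimum unit length, template layout, and candidate parameter grid are all assumptions made for the example.

```python
# A minimal sketch, assuming a 1-D angular-velocity stream and a simple
# amplitude threshold; the paper's actual segmentation rule and DTW variant
# may differ.
import numpy as np

def split_motion_units(angular_velocity, threshold=0.1, min_len=10):
    """Split an angular-velocity stream into motion units.

    A sample counts as "moving" when |omega| exceeds `threshold` (both
    values hypothetical); consecutive moving samples form one motion unit,
    mirroring the abstract's "based on whether the upper limb is moving".
    """
    moving = np.abs(angular_velocity) > threshold
    units, start = [], None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i
        elif not m and start is not None:
            if i - start >= min_len:
                units.append(angular_velocity[start:i])
            start = None
    if start is not None and len(angular_velocity) - start >= min_len:
        units.append(angular_velocity[start:])
    return units

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) DTW distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(units, templates):
    """Nearest-template classification on the first two motion units only,
    as the abstract does to keep control latency low. `templates` maps a
    hypothetical action label (e.g. "socks", "shoes", "laces") to its
    reference motion units."""
    query = units[:2]
    best_label, best_cost = None, np.inf
    for label, ref_units in templates.items():
        cost = sum(dtw_distance(q, r) for q, r in zip(query, ref_units[:2]))
        if cost < best_cost:
            best_label, best_cost = label, cost
    return best_label
```

The GRNN named in the abstract is, in its textbook form, a kernel-weighted average of training targets; for classification the targets are one-hot label vectors and the predicted class is the argmax of the averaged scores. A sketch of that formulation, with 10-fold cross-validation over candidate smoothing parameters (the grid and feature layout are invented for the example):

```python
import numpy as np

def grnn_predict(X_train, Y_train, X_query, sigma):
    """GRNN output: Gaussian-kernel-weighted average of training targets.
    Y_train is one-hot (n_samples, n_classes); returns class scores."""
    # Squared Euclidean distance between every query and training sample.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    W = np.exp(-d2 / (2.0 * sigma ** 2))           # pattern-layer activations
    return (W @ Y_train) / (W.sum(axis=1, keepdims=True) + 1e-12)

def choose_sigma(X, Y, sigmas, n_folds=10, seed=0):
    """Pick the sigma with the best mean 10-fold classification accuracy."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), n_folds)
    best_sigma, best_acc = None, -1.0
    for sigma in sigmas:
        accs = []
        for k in range(n_folds):
            val = folds[k]
            trn = np.concatenate([folds[j] for j in range(n_folds) if j != k])
            scores = grnn_predict(X[trn], Y[trn], X[val], sigma)
            accs.append((scores.argmax(1) == Y[val].argmax(1)).mean())
        acc = float(np.mean(accs))
        if acc > best_acc:
            best_sigma, best_acc = sigma, acc
    return best_sigma, best_acc

# Example (hypothetical data): sigma, acc = choose_sigma(X, Y_onehot,
#                                  sigmas=np.logspace(-2, 1, 20))
```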
first_indexed | 2024-03-09T20:20:10Z |
format | Article |
id | doaj.art-245eef58c189478cb57544ca2c7bac7d |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
last_indexed | 2024-03-09T20:20:10Z |
publishDate | 2022-03-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj.art-245eef58c189478cb57544ca2c7bac7d | 2023-11-23T23:48:54Z | eng | MDPI AG | Sensors | 1424-8220 | 2022-03-01 | vol. 22, issue 5, article 1954 | doi:10.3390/s22051954 | Recognition of Upper Limb Action Intention Based on IMU | Jian-Wei Cui; Zhi-Gang Li; Han Du; Bing-Yan Yan; Pu-Dong Lu (School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China) | https://www.mdpi.com/1424-8220/22/5/1954
spellingShingle | Jian-Wei Cui; Zhi-Gang Li; Han Du; Bing-Yan Yan; Pu-Dong Lu | Recognition of Upper Limb Action Intention Based on IMU | Sensors | action intention recognition of upper limb; prosthetic hand control; inertial sensor; dividing motion unit; 10-fold cross-validation
title | Recognition of Upper Limb Action Intention Based on IMU |
title_full | Recognition of Upper Limb Action Intention Based on IMU |
title_fullStr | Recognition of Upper Limb Action Intention Based on IMU |
title_full_unstemmed | Recognition of Upper Limb Action Intention Based on IMU |
title_short | Recognition of Upper Limb Action Intention Based on IMU |
title_sort | recognition of upper limb action intention based on imu |
topic | action intention recognition of upper limb; prosthetic hand control; inertial sensor; dividing motion unit; 10-fold cross-validation
url | https://www.mdpi.com/1424-8220/22/5/1954 |
work_keys_str_mv | AT jianweicui recognitionofupperlimbactionintentionbasedonimu AT zhigangli recognitionofupperlimbactionintentionbasedonimu AT handu recognitionofupperlimbactionintentionbasedonimu AT bingyanyan recognitionofupperlimbactionintentionbasedonimu AT pudonglu recognitionofupperlimbactionintentionbasedonimu |