Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors
This study proposes a telemanipulation framework with two wearable IMU sensors without human skeletal kinematics. First, the states (intensity and direction) of spatial hand-guiding gestures are separately estimated through the proposed state estimator, and the states are then combined with the gesture’s mode (linear, angular, and via) obtained with the bi-directional LSTM-based mode classifier. The spatial pose of the 6-DOF manipulator’s end-effector (EEF) can be controlled by combining the spatial linear and angular motions based on integrating the gesture’s mode and state. To validate the significance of the proposed method, the teleoperation of the EEF to the designated target poses was conducted in the motion-capture space. As a result, it was confirmed that the mode could be classified with 84.5% accuracy in real time, even during the operator’s dynamic movement; the direction could be estimated with an error of less than 1 degree; and the intensity could be successfully estimated with the gesture speed estimator and finely tuned with the scaling factor. Finally, it was confirmed that a subject could place the EEF within an average of 83 mm and 2.56 degrees of the target pose with fewer than ten consecutive hand-guiding gestures and visual inspection in the first trial.
Main Authors: | Haegyeom Choi, Haneul Jeon, Donghyeon Noh, Taeho Kim, Donghun Lee |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-08-01 |
Series: | Mathematics |
Subjects: | hand-guiding gesture; gesture recognition; gesture state estimation; real-time remote control; bi-directional LSTM; wearable sensor |
Online Access: | https://www.mdpi.com/2227-7390/11/16/3514 |
_version_ | 1797583977325789184 |
---|---|
author | Haegyeom Choi; Haneul Jeon; Donghyeon Noh; Taeho Kim; Donghun Lee
author_facet | Haegyeom Choi; Haneul Jeon; Donghyeon Noh; Taeho Kim; Donghun Lee
author_sort | Haegyeom Choi |
collection | DOAJ |
description | This study proposes a telemanipulation framework with two wearable IMU sensors without human skeletal kinematics. First, the states (intensity and direction) of spatial hand-guiding gestures are separately estimated through the proposed state estimator, and the states are then combined with the gesture’s mode (linear, angular, and via) obtained with the bi-directional LSTM-based mode classifier. The spatial pose of the 6-DOF manipulator’s end-effector (EEF) can be controlled by combining the spatial linear and angular motions based on integrating the gesture’s mode and state. To validate the significance of the proposed method, the teleoperation of the EEF to the designated target poses was conducted in the motion-capture space. As a result, it was confirmed that the mode could be classified with 84.5% accuracy in real time, even during the operator’s dynamic movement; the direction could be estimated with an error of less than 1 degree; and the intensity could be successfully estimated with the gesture speed estimator and finely tuned with the scaling factor. Finally, it was confirmed that a subject could place the EEF within an average of 83 mm and 2.56 degrees of the target pose with fewer than ten consecutive hand-guiding gestures and visual inspection in the first trial. |
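The abstract describes combining a classified gesture mode (linear, angular, or via) with an estimated gesture state (direction and intensity, tuned by a scaling factor) into an incremental EEF pose command. A minimal sketch of that combination step, assuming hypothetical names and unit conventions (the paper does not publish code, so this is an illustration of the idea, not the authors' implementation):

```python
# Hypothetical sketch: map one recognized hand-guiding gesture to a small
# end-effector (EEF) pose increment, combining the classifier's mode output
# with the state estimator's direction/intensity output.

LINEAR, ANGULAR, VIA = "linear", "angular", "via"

def eef_increment(mode, direction, intensity, scale=0.01):
    """Return (dx, dtheta): translation and rotation increments.

    mode      -- gesture mode from the (bi-LSTM-style) classifier
    direction -- unit 3-vector estimated from the wrist IMU
    intensity -- gesture speed estimate (unitless here)
    scale     -- operator-tuned scaling factor (assumed, not from the paper)
    """
    mag = scale * intensity
    if mode == LINEAR:    # translate along the guided axis
        return [mag * d for d in direction], [0.0, 0.0, 0.0]
    if mode == ANGULAR:   # rotate about the guided axis
        return [0.0, 0.0, 0.0], [mag * d for d in direction]
    return [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]  # "via" gesture: no motion step

# One linear gesture of intensity 2.0 pointing along +x:
dx, dtheta = eef_increment(LINEAR, [1.0, 0.0, 0.0], 2.0)
```

Successive gesture increments would be accumulated into the commanded EEF pose; the scaling factor plays the fine-tuning role the abstract attributes to it.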
first_indexed | 2024-03-10T23:45:50Z |
format | Article |
id | doaj.art-a6ce3fc8d9484d0c8b6db3b68c820d5f |
institution | Directory Open Access Journal |
issn | 2227-7390 |
language | English |
last_indexed | 2024-03-10T23:45:50Z |
publishDate | 2023-08-01 |
publisher | MDPI AG |
record_format | Article |
series | Mathematics |
spelling | doaj.art-a6ce3fc8d9484d0c8b6db3b68c820d5f | 2023-11-19T02:03:07Z | eng | MDPI AG | Mathematics | 2227-7390 | 2023-08-01 | Vol. 11, Iss. 16, Art. 3514 | 10.3390/math11163514 | Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors | Haegyeom Choi, Haneul Jeon, Donghyeon Noh, Taeho Kim, Donghun Lee (all: Mechanical Engineering Department, Soongsil University, Seoul 06978, Republic of Korea) | This study proposes a telemanipulation framework with two wearable IMU sensors without human skeletal kinematics. First, the states (intensity and direction) of spatial hand-guiding gestures are separately estimated through the proposed state estimator, and the states are then combined with the gesture’s mode (linear, angular, and via) obtained with the bi-directional LSTM-based mode classifier. The spatial pose of the 6-DOF manipulator’s end-effector (EEF) can be controlled by combining the spatial linear and angular motions based on integrating the gesture’s mode and state. To validate the significance of the proposed method, the teleoperation of the EEF to the designated target poses was conducted in the motion-capture space. As a result, it was confirmed that the mode could be classified with 84.5% accuracy in real time, even during the operator’s dynamic movement; the direction could be estimated with an error of less than 1 degree; and the intensity could be successfully estimated with the gesture speed estimator and finely tuned with the scaling factor. Finally, it was confirmed that a subject could place the EEF within an average of 83 mm and 2.56 degrees of the target pose with fewer than ten consecutive hand-guiding gestures and visual inspection in the first trial. | https://www.mdpi.com/2227-7390/11/16/3514 | hand-guiding gesture; gesture recognition; gesture state estimation; real-time remote control; bi-directional LSTM; wearable sensor |
spellingShingle | Haegyeom Choi; Haneul Jeon; Donghyeon Noh; Taeho Kim; Donghun Lee | Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors | Mathematics | hand-guiding gesture; gesture recognition; gesture state estimation; real-time remote control; bi-directional LSTM; wearable sensor |
title | Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors |
title_full | Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors |
title_fullStr | Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors |
title_full_unstemmed | Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors |
title_short | Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors |
title_sort | hand guiding gesture based telemanipulation with the gesture mode classification and state estimation using wearable imu sensors |
topic | hand-guiding gesture; gesture recognition; gesture state estimation; real-time remote control; bi-directional LSTM; wearable sensor |
url | https://www.mdpi.com/2227-7390/11/16/3514 |
work_keys_str_mv | AT haegyeomchoi handguidinggesturebasedtelemanipulationwiththegesturemodeclassificationandstateestimationusingwearableimusensors AT haneuljeon handguidinggesturebasedtelemanipulationwiththegesturemodeclassificationandstateestimationusingwearableimusensors AT donghyeonnoh handguidinggesturebasedtelemanipulationwiththegesturemodeclassificationandstateestimationusingwearableimusensors AT taehokim handguidinggesturebasedtelemanipulationwiththegesturemodeclassificationandstateestimationusingwearableimusensors AT donghunlee handguidinggesturebasedtelemanipulationwiththegesturemodeclassificationandstateestimationusingwearableimusensors |