A Deep Learning Approach for Biped Robot Locomotion Interface Using a Single Inertial Sensor
Main Authors: | Tsige Tadesse Alemayoh, Jae Hoon Lee, Shingo Okamoto |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-12-01 |
Series: | Sensors |
Subjects: | motion synthesis; deep learning; walking controller; inertial sensor |
Online Access: | https://www.mdpi.com/1424-8220/23/24/9841 |
author | Tsige Tadesse Alemayoh; Jae Hoon Lee; Shingo Okamoto |
collection | DOAJ |
description | In this study, we introduce a novel framework that combines human motion parameterization from a single inertial sensor, motion synthesis from these parameters, and biped robot motion control using the synthesized motion. This framework applies advanced deep learning methods to data obtained from an IMU attached to a human subject’s pelvis. This minimalistic sensor setup simplifies data collection and avoids the cost and complexity of multi-sensor systems. We employed a Bi-LSTM encoder to estimate key human motion parameters, namely walking velocity and gait phase, from the IMU data. This step is followed by a feedforward motion generator-decoder network that accurately produces the lower limb joint angles and displacement corresponding to these parameters. Additionally, our method introduces a Fourier series-based approach to generate these key motion parameters solely from user commands, specifically walking speed and gait period. Hence, the decoder can receive inputs either from the encoder or directly from the Fourier series parameter generator. The output of the decoder network is then used as a reference motion for the walking control of a biped robot, employing a constraint-consistent inverse dynamics control algorithm. This framework thus enables biped robot motion planning from either a single inertial sensor or two user commands. The proposed method was validated through robot simulations in the MuJoCo physics engine. The motion controller achieved an error of ≤5° in tracking the joint angles, demonstrating the effectiveness of the proposed framework. This was accomplished using minimal sensor data or a few user commands, providing a promising foundation for robotic control and human–robot interaction. |
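The pipeline described in the abstract (Bi-LSTM encoder → walking velocity and gait phase → feedforward decoder → joint angles, with an alternative Fourier-series parameter path driven only by walking speed and gait period) can be sketched roughly as below. This is a minimal, hypothetical PyTorch reconstruction under stated assumptions: the IMU channel count, window length, layer sizes, phase encoding, and output dimensions are illustrative choices, not values taken from the paper.

```python
# Illustrative sketch of the encoder/decoder pipeline from the abstract.
# All dimensions, layer sizes, and the Fourier-series stand-in are assumptions.
import math
import torch
import torch.nn as nn


class BiLSTMEncoder(nn.Module):
    """Maps an IMU window (e.g., 3-axis accel + 3-axis gyro) to motion parameters."""

    def __init__(self, imu_channels: int = 6, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(imu_channels, hidden, batch_first=True, bidirectional=True)
        # Assumed outputs: walking velocity, sin(gait phase), cos(gait phase)
        self.head = nn.Linear(2 * hidden, 3)

    def forward(self, imu_seq: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(imu_seq)        # (batch, time, 2*hidden)
        return self.head(out[:, -1])       # parameters from the last time step


class MotionDecoder(nn.Module):
    """Feedforward generator: motion parameters -> joint angles + displacement."""

    def __init__(self, n_joints: int = 6, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_joints + 1),  # joint angles + pelvis displacement
        )

    def forward(self, params: torch.Tensor) -> torch.Tensor:
        return self.net(params)


def command_parameters(speed: float, gait_period: float, t: float) -> torch.Tensor:
    """Stand-in for the paper's Fourier-series generator: produces
    (velocity, sin(phase), cos(phase)) from two user commands, with the gait
    phase advancing linearly over one gait period."""
    phase = 2.0 * math.pi * (t % gait_period) / gait_period
    return torch.tensor([[speed, math.sin(phase), math.cos(phase)]])


if __name__ == "__main__":
    encoder, decoder = BiLSTMEncoder(), MotionDecoder()
    # Path A: parameters estimated from a 100-sample, 6-channel IMU window.
    imu_window = torch.randn(1, 100, 6)
    motion_a = decoder(encoder(imu_window))
    # Path B: parameters generated directly from user commands (speed, period).
    motion_b = decoder(command_parameters(speed=1.2, gait_period=1.1, t=0.35))
    print(motion_a.shape, motion_b.shape)  # torch.Size([1, 7]) each
```

Either output would then serve as the reference trajectory for the robot's walking controller; the constraint-consistent inverse dynamics control itself is outside the scope of this sketch.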
format | Article |
id | doaj.art-58732c7e677d46ef83b9cd9a63c63f4d |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
publishDate | 2023-12-01 |
publisher | MDPI AG |
series | Sensors |
doi | 10.3390/s23249841 |
citation | Sensors, vol. 23, no. 24, article 9841, 2023-12-01 |
affiliation | Tsige Tadesse Alemayoh, Jae Hoon Lee, Shingo Okamoto: Department of Mechanical Engineering, Graduate School of Science and Engineering, Ehime University, Bunkyo-cho 3, Matsuyama 790-8577, Ehime, Japan |
title | A Deep Learning Approach for Biped Robot Locomotion Interface Using a Single Inertial Sensor |
topic | motion synthesis; deep learning; walking controller; inertial sensor |
url | https://www.mdpi.com/1424-8220/23/24/9841 |