Fusion of Multiple Lidars and Inertial Sensors for the Real-Time Pose Tracking of Human Motion
Today, advances in sensing technology enable the use of multiple sensors to track human motion and activity precisely. Tracking human motion has various applications, such as fitness training, healthcare, rehabilitation, human-computer interaction, virtual reality, and activity recognition. The fusion of multiple sensors therefore creates new opportunities to develop and improve existing systems...
Main Authors: | Ashok Kumar Patil, Adithya Balasubramanyam, Jae Yeong Ryu, Pavan Kumar B N, Bharatesh Chakravarthi, Young Ho Chai |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-09-01 |
Series: | Sensors |
Subjects: | human motion, activity recognition, position estimation, lidar, inertial sensor, motion reconstruction |
Online Access: | https://www.mdpi.com/1424-8220/20/18/5342 |
_version_ | 1827705963918917632 |
---|---|
author | Ashok Kumar Patil; Adithya Balasubramanyam; Jae Yeong Ryu; Pavan Kumar B N; Bharatesh Chakravarthi; Young Ho Chai |
author_facet | Ashok Kumar Patil; Adithya Balasubramanyam; Jae Yeong Ryu; Pavan Kumar B N; Bharatesh Chakravarthi; Young Ho Chai |
author_sort | Ashok Kumar Patil |
collection | DOAJ |
description | Today, advances in sensing technology enable the use of multiple sensors to track human motion and activity precisely. Tracking human motion has various applications, such as fitness training, healthcare, rehabilitation, human-computer interaction, virtual reality, and activity recognition. The fusion of multiple sensors therefore creates new opportunities to develop and improve existing systems. This paper proposes a pose-tracking system that fuses multiple three-dimensional (3D) light detection and ranging (lidar) sensors and inertial measurement units (IMUs). The initial step estimates the human skeletal parameters proportional to the target user’s height by extracting the point cloud from the lidars. Next, the IMUs are used to capture the orientation of each skeleton segment and estimate the respective joint positions. In the final stage, the displacement drift in the position is corrected by fusing the data from both sensors in real time. The installation setup is relatively effortless, flexible in sensor placement, and delivers results comparable to state-of-the-art pose-tracking systems. We evaluated the proposed system with regard to its accuracy in the user’s height estimation, full-body joint position estimation, and reconstruction of the 3D avatar, using a publicly available dataset wherever possible. The results reveal that the height and position estimates are well within an acceptable range of ±3–5 cm, and the motion reconstructed from both the public dataset and our own data is precise and realistic. |
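The three-stage pipeline in the abstract (skeletal parameters from lidar point clouds, joint positions from IMU segment orientations, real-time drift correction by fusing both) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: `quat_rotate`, `forward_kinematics`, and `fuse_position` are hypothetical helpers, and the complementary-filter blend with weight `alpha` is an assumed stand-in for the paper's fusion step.

```python
def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * (q_vec x v)
    tx = 2.0 * (y * vz - z * vy)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    # v' = v + w*t + (q_vec x t)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

def forward_kinematics(root_pos, segments):
    """Accumulate joint positions down a kinematic chain.

    segments: one (imu_orientation_quat, bone_length, rest_direction) per bone;
    the bone lengths would come from the lidar-based height/skeleton estimate,
    the orientations from the IMU attached to each segment.
    """
    joints = [root_pos]
    pos = root_pos
    for q, length, rest_dir in segments:
        offset = quat_rotate(q, tuple(length * c for c in rest_dir))
        pos = tuple(p + o for p, o in zip(pos, offset))
        joints.append(pos)
    return joints

def fuse_position(imu_pos, lidar_pos, alpha=0.98):
    """Complementary blend: trust the IMU estimate short-term and the lidar
    position long-term, so integration drift in the IMU-derived position is
    pulled back toward the lidar fix each frame (alpha is an assumed weight)."""
    return tuple(alpha * i + (1.0 - alpha) * l for i, l in zip(imu_pos, lidar_pos))
```

For example, two 0.4 m bones with identity orientations stacked along +y place the end joint at y = 0.8 m; each frame, the drifting root of the IMU-driven chain can be re-anchored with `fuse_position` against a position extracted from the lidar point cloud.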
first_indexed | 2024-03-10T16:14:52Z |
format | Article |
id | doaj.art-c034f64102014b1e81bc9357df7d9c74 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
last_indexed | 2024-03-10T16:14:52Z |
publishDate | 2020-09-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj.art-c034f64102014b1e81bc9357df7d9c74 | 2023-11-20T14:11:35Z | eng | MDPI AG | Sensors | 1424-8220 | 2020-09-01 | Vol. 20, Iss. 18, Art. 5342 | 10.3390/s20185342 | Fusion of Multiple Lidars and Inertial Sensors for the Real-Time Pose Tracking of Human Motion | Ashok Kumar Patil; Adithya Balasubramanyam; Jae Yeong Ryu; Pavan Kumar B N; Bharatesh Chakravarthi; Young Ho Chai (all affiliated with Virtual Environments Lab, Graduate School of Advanced Imaging Science, Multimedia and Film, Chung-Ang University, Seoul 06974, Korea) | https://www.mdpi.com/1424-8220/20/18/5342 | human motion; activity recognition; position estimation; lidar; inertial sensor; motion reconstruction |
spellingShingle | Ashok Kumar Patil; Adithya Balasubramanyam; Jae Yeong Ryu; Pavan Kumar B N; Bharatesh Chakravarthi; Young Ho Chai | Fusion of Multiple Lidars and Inertial Sensors for the Real-Time Pose Tracking of Human Motion | Sensors | human motion; activity recognition; position estimation; lidar; inertial sensor; motion reconstruction |
title | Fusion of Multiple Lidars and Inertial Sensors for the Real-Time Pose Tracking of Human Motion |
title_full | Fusion of Multiple Lidars and Inertial Sensors for the Real-Time Pose Tracking of Human Motion |
title_fullStr | Fusion of Multiple Lidars and Inertial Sensors for the Real-Time Pose Tracking of Human Motion |
title_full_unstemmed | Fusion of Multiple Lidars and Inertial Sensors for the Real-Time Pose Tracking of Human Motion |
title_short | Fusion of Multiple Lidars and Inertial Sensors for the Real-Time Pose Tracking of Human Motion |
title_sort | fusion of multiple lidars and inertial sensors for the real time pose tracking of human motion |
topic | human motion; activity recognition; position estimation; lidar; inertial sensor; motion reconstruction |
url | https://www.mdpi.com/1424-8220/20/18/5342 |
work_keys_str_mv | AT ashokkumarpatil fusionofmultiplelidarsandinertialsensorsfortherealtimeposetrackingofhumanmotion AT adithyabalasubramanyam fusionofmultiplelidarsandinertialsensorsfortherealtimeposetrackingofhumanmotion AT jaeyeongryu fusionofmultiplelidarsandinertialsensorsfortherealtimeposetrackingofhumanmotion AT pavankumarbn fusionofmultiplelidarsandinertialsensorsfortherealtimeposetrackingofhumanmotion AT bharateshchakravarthi fusionofmultiplelidarsandinertialsensorsfortherealtimeposetrackingofhumanmotion AT younghochai fusionofmultiplelidarsandinertialsensorsfortherealtimeposetrackingofhumanmotion |