Learning inertial odometry for dynamic legged robot state estimation

This paper introduces a novel proprioceptive state estimator for legged robots based on a learned displacement measurement from IMU data. Recent research in pedestrian tracking has shown that motion can be inferred from inertial data using convolutional neural networks. A learned inertial displacement measurement can improve state estimation in challenging scenarios where leg odometry is unreliable, such as slipping and compressible terrain. Our work learns to estimate a displacement measurement from IMU data, which is then fused with traditional leg odometry. Our approach greatly reduces the drift of proprioceptive state estimation, which is critical for legged robots deployed in vision- and lidar-denied environments such as foggy sewers or dusty mines. We compared results from an EKF and an incremental fixed-lag factor graph estimator using data from several real robot experiments crossing challenging terrains. Our results show a 37% reduction in relative pose error in challenging scenarios when compared to a traditional kinematic-inertial estimator without the learned measurement. We also demonstrate a 22% reduction in error when used with vision systems in visually degraded environments such as an underground mine.

Bibliographic Details
Main Authors: Buchanan, R, Camurri, M, Dellaert, F, Fallon, M
Format: Conference item
Language: English
Published: Journal of Machine Learning Research, 2022
Institution: University of Oxford
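
To make the approach described in the abstract concrete, the sketch below shows one way a convolutional network can regress a 3-D displacement from a fixed-length window of IMU samples. It is a minimal illustration under assumed shapes and layer sizes; the class name ImuDisplacementNet, the 200-sample window, and the architecture are placeholders rather than the authors' network. In the paper, such a learned displacement is fused with traditional leg odometry in an EKF or an incremental fixed-lag factor graph.

```python
# Illustrative sketch only (not the authors' network): a small 1-D CNN that
# regresses a body-frame displacement from a window of IMU readings, in the
# spirit of learned inertial odometry for pedestrian tracking.
import torch
import torch.nn as nn


class ImuDisplacementNet(nn.Module):
    """Maps a window of IMU readings (accel + gyro, 6 channels) to a 3-D
    displacement over that window. Layer sizes are arbitrary placeholders."""

    def __init__(self, window_len: int = 200):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(6, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),        # pool over time -> (B, 64, 1)
        )
        self.head = nn.Linear(64, 3)         # 3-D displacement

    def forward(self, imu_window: torch.Tensor) -> torch.Tensor:
        # imu_window: (batch, 6, window_len), channels = [ax, ay, az, wx, wy, wz]
        feat = self.encoder(imu_window).squeeze(-1)   # (batch, 64)
        return self.head(feat)                        # (batch, 3)


if __name__ == "__main__":
    net = ImuDisplacementNet()
    dummy = torch.randn(4, 6, 200)   # four 200-sample IMU windows
    print(net(dummy).shape)          # torch.Size([4, 3])
```

In a full pipeline, the predicted displacement would typically be paired with a measurement covariance so that the fusion stage (EKF or factor graph) can weight it against kinematic-inertial leg odometry.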