A Visual-Aided Inertial Navigation and Mapping System
State estimation is a fundamental necessity for any application involving autonomous robots. This paper describes a visual-aided inertial navigation and mapping system for application to autonomous robots. The system, which relies on Kalman filtering, is designed to fuse the measurements obtained from a monocular camera, an inertial measurement unit (IMU) and a position sensor (GPS). The estimated state consists of the full state of the vehicle: the position, orientation, their first derivatives and the parameter errors of the inertial sensors (i.e., the bias of gyroscopes and accelerometers). The system also provides the spatial locations of the visual features observed by the camera.
Main Authors: Rodrigo Munguía, Emmanuel Nuño, Carlos I. Aldana, Sarquis Urzua
Format: Article
Language: English
Published: SAGE Publishing, 2016-05-01
Series: International Journal of Advanced Robotic Systems
Online Access: https://doi.org/10.5772/64011
_version_ | 1828834820396941312 |
author | Rodrigo Munguía; Emmanuel Nuño; Carlos I. Aldana; Sarquis Urzua |
author_facet | Rodrigo Munguía; Emmanuel Nuño; Carlos I. Aldana; Sarquis Urzua |
author_sort | Rodrigo Munguía |
collection | DOAJ |
description | State estimation is a fundamental necessity for any application involving autonomous robots. This paper describes a visual-aided inertial navigation and mapping system for application to autonomous robots. The system, which relies on Kalman filtering, is designed to fuse the measurements obtained from a monocular camera, an inertial measurement unit (IMU) and a position sensor (GPS). The estimated state consists of the full state of the vehicle: the position, orientation, their first derivatives and the parameter errors of the inertial sensors (i.e., the bias of gyroscopes and accelerometers). The system also provides the spatial locations of the visual features observed by the camera. The proposed scheme was designed by considering the limited resources commonly available in small mobile robots, while it is intended to be applied to cluttered environments in order to perform fully vision-based navigation in periods where the position sensor is not available. Moreover, the estimated map of visual features would be suitable for multiple tasks: i) terrain analysis; ii) three-dimensional (3D) scene reconstruction; iii) localization, detection or perception of obstacles and generating trajectories to navigate around these obstacles; and iv) autonomous exploration. In this work, simulations and experiments with real data are presented in order to validate and demonstrate the performance of the proposal. |
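The description above outlines a Kalman-filter-based fusion of camera, IMU and GPS measurements. As a rough illustration of the predict/update machinery such a filter relies on (this is not the paper's actual filter, which is an extended Kalman filter over the full 3D pose, its derivatives and the IMU biases), the sketch below fuses a noisy GPS-like position sensor with a constant-velocity motion model in one dimension. All function names, models and noise values here are assumptions chosen for the example.

```python
import numpy as np

def predict(x, P, F, Q):
    """Propagate the state estimate x and covariance P through the motion model F."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, H, R):
    """Correct the prediction with measurement z (standard Kalman update)."""
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity model (assumed)
Q = 1e-3 * np.eye(2)                   # process noise (assumed)
H = np.array([[1.0, 0.0]])             # GPS-like sensor observes position only
R = np.array([[0.5]])                  # measurement noise variance (assumed)

x = np.zeros(2)  # state: [position, velocity]
P = np.eye(2)

rng = np.random.default_rng(0)
for k in range(100):
    true_pos = 1.0 * (k + 1) * dt      # ground truth: steady 1 m/s motion
    x, P = predict(x, P, F, Q)
    z = np.array([true_pos + rng.normal(0.0, 0.5)])
    x, P = update(x, P, z, H, R)

# After 100 steps the filter should track position near 10 m and
# velocity near 1 m/s, despite only ever measuring noisy position.
```

In the full system described by the paper, the single linear measurement model above would be replaced by the camera's feature projections and the GPS position, while the IMU drives the prediction step; the same predict/correct cycle applies.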
first_indexed | 2024-12-12T17:47:09Z |
format | Article |
id | doaj.art-b8d2e3c773274d9598db41b928cd571b |
institution | Directory Open Access Journal |
issn | 1729-8814 |
language | English |
last_indexed | 2024-12-12T17:47:09Z |
publishDate | 2016-05-01 |
publisher | SAGE Publishing |
record_format | Article |
series | International Journal of Advanced Robotic Systems |
spelling | doaj.art-b8d2e3c773274d9598db41b928cd571b, indexed 2022-12-22T00:16:54Z. SAGE Publishing, International Journal of Advanced Robotic Systems, ISSN 1729-8814, 2016-05-01, Vol. 13, doi:10.5772/64011. "A Visual-Aided Inertial Navigation and Mapping System" by Rodrigo Munguía, Emmanuel Nuño, Carlos I. Aldana and Sarquis Urzua (University of Guadalajara, Guadalajara, Jalisco, Mexico). https://doi.org/10.5772/64011 |
title | A Visual-Aided Inertial Navigation and Mapping System |
title_full | A Visual-Aided Inertial Navigation and Mapping System |
title_fullStr | A Visual-Aided Inertial Navigation and Mapping System |
title_full_unstemmed | A Visual-Aided Inertial Navigation and Mapping System |
title_short | A Visual-Aided Inertial Navigation and Mapping System |
title_sort | visual aided inertial navigation and mapping system |
url | https://doi.org/10.5772/64011 |