Vision-Based SLAM System for Unmanned Aerial Vehicles

The present paper describes a vision-based simultaneous localization and mapping (SLAM) system for Unmanned Aerial Vehicles (UAVs). The main contribution of this work is a novel estimator relying on an Extended Kalman Filter (EKF), designed to fuse the measurements obtained from: (i) an orientation sensor (AHRS); (ii) a position sensor (GPS); and (iii) a monocular camera. The estimated state consists of the full state of the vehicle (position and orientation and their first derivatives) as well as the locations of the landmarks observed by the camera. The position sensor is used only during the initialization period, in order to recover the metric scale of the world. Afterwards, the estimated map of landmarks is used to perform fully vision-based navigation when the position sensor is not available. Experimental results obtained with simulations and real data show the benefits of including camera measurements in the system: the estimate of the vehicle trajectory is considerably improved compared with the estimate obtained using only the position sensor, whose measurements are commonly low-rate and highly noisy.


Bibliographic Details
Main Authors: Rodrigo Munguía, Sarquis Urzua, Yolanda Bolea, Antoni Grau
Format: Article
Language: English
Published: MDPI AG 2016-03-01
Series: Sensors
Subjects: state estimation; unmanned aerial vehicle; monocular vision; localization; mapping
Online Access: http://www.mdpi.com/1424-8220/16/3/372
_version_ 1828365235457622016
author Rodrigo Munguía
Sarquis Urzua
Yolanda Bolea
Antoni Grau
author_facet Rodrigo Munguía
Sarquis Urzua
Yolanda Bolea
Antoni Grau
author_sort Rodrigo Munguía
collection DOAJ
description The present paper describes a vision-based simultaneous localization and mapping (SLAM) system for Unmanned Aerial Vehicles (UAVs). The main contribution of this work is a novel estimator relying on an Extended Kalman Filter (EKF), designed to fuse the measurements obtained from: (i) an orientation sensor (AHRS); (ii) a position sensor (GPS); and (iii) a monocular camera. The estimated state consists of the full state of the vehicle (position and orientation and their first derivatives) as well as the locations of the landmarks observed by the camera. The position sensor is used only during the initialization period, in order to recover the metric scale of the world. Afterwards, the estimated map of landmarks is used to perform fully vision-based navigation when the position sensor is not available. Experimental results obtained with simulations and real data show the benefits of including camera measurements in the system: the estimate of the vehicle trajectory is considerably improved compared with the estimate obtained using only the position sensor, whose measurements are commonly low-rate and highly noisy.
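The estimator outlined in the abstract — an EKF whose state stacks the vehicle pose, its first derivatives, and the observed landmarks — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the Euler-angle orientation, constant-velocity motion model, landmark count, and noise levels are all assumptions made for the example, and the camera (landmark-projection) update is omitted; only the GPS position update used during initialization is shown.

```python
import numpy as np

# Assumed state layout (following the abstract, dimensions invented here):
# [ position(3) | velocity(3) | orientation(3, Euler) |
#   angular rate(3) | landmark_1(3) | ... | landmark_N(3) ]
N_LANDMARKS = 2
DIM = 12 + 3 * N_LANDMARKS

def predict(x, P, Q, dt):
    """Constant-velocity prediction: each pose term integrates its rate."""
    F = np.eye(DIM)
    F[0:3, 3:6] = dt * np.eye(3)    # position += velocity * dt
    F[6:9, 9:12] = dt * np.eye(3)   # orientation += angular rate * dt
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    """Standard EKF correction for a (linearised) measurement z ~ H x."""
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(DIM) - K @ H) @ P
    return x, P

# GPS observes position directly; per the abstract it is used only during
# initialization, to recover the metric scale of the world.
H_gps = np.zeros((3, DIM))
H_gps[0:3, 0:3] = np.eye(3)

x, P = np.zeros(DIM), np.eye(DIM)
Q, R_gps = 1e-3 * np.eye(DIM), 4.0 * np.eye(3)  # assumed noise levels
gps_fix = np.array([1.0, 0.0, 5.0])             # hypothetical fix, metres
for _ in range(20):
    x, P = predict(x, P, Q, dt=0.1)
    x, P = update(x, P, gps_fix, H_gps, R_gps)
print(np.round(x[0:3], 2))  # position estimate converges toward the fix
```

Once the GPS is dropped after initialization, the same `update` step would instead be driven by the camera, with `H` replaced by the Jacobian of the landmark-projection model linearised at the current state estimate.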
first_indexed 2024-04-14T05:28:15Z
format Article
id doaj.art-76828559de654811b81a51a184e453ab
institution Directory Open Access Journal
issn 1424-8220
language English
last_indexed 2024-04-14T05:28:15Z
publishDate 2016-03-01
publisher MDPI AG
record_format Article
series Sensors
spelling doaj.art-76828559de654811b81a51a184e453ab 2022-12-22T02:09:55Z
eng | MDPI AG | Sensors | 1424-8220 | 2016-03-01 | Vol. 16, Iss. 3, Art. 372 | 10.3390/s16030372 | s16030372
Vision-Based SLAM System for Unmanned Aerial Vehicles
Rodrigo Munguía (0: Department of Automatic Control, Technical University of Catalonia UPC, Barcelona 08036, Spain)
Sarquis Urzua (1: Department of Computer Science, CUCEI, University of Guadalajara, Guadalajara 44430, Mexico)
Yolanda Bolea (2: Department of Automatic Control, Technical University of Catalonia UPC, Barcelona 08036, Spain)
Antoni Grau (3: Department of Automatic Control, Technical University of Catalonia UPC, Barcelona 08036, Spain)
http://www.mdpi.com/1424-8220/16/3/372
state estimation; unmanned aerial vehicle; monocular vision; localization; mapping
spellingShingle Rodrigo Munguía
Sarquis Urzua
Yolanda Bolea
Antoni Grau
Vision-Based SLAM System for Unmanned Aerial Vehicles
Sensors
state estimation
unmanned aerial vehicle
monocular vision
localization
mapping
title Vision-Based SLAM System for Unmanned Aerial Vehicles
title_full Vision-Based SLAM System for Unmanned Aerial Vehicles
title_fullStr Vision-Based SLAM System for Unmanned Aerial Vehicles
title_full_unstemmed Vision-Based SLAM System for Unmanned Aerial Vehicles
title_short Vision-Based SLAM System for Unmanned Aerial Vehicles
title_sort vision based slam system for unmanned aerial vehicles
topic state estimation
unmanned aerial vehicle
monocular vision
localization
mapping
url http://www.mdpi.com/1424-8220/16/3/372
work_keys_str_mv AT rodrigomunguia visionbasedslamsystemforunmannedaerialvehicles
AT sarquisurzua visionbasedslamsystemforunmannedaerialvehicles
AT yolandabolea visionbasedslamsystemforunmannedaerialvehicles
AT antonigrau visionbasedslamsystemforunmannedaerialvehicles