Estimation, planning, and mapping for autonomous flight using an RGB-D camera in GPS-denied environments

RGB-D cameras provide both color images and per-pixel depth estimates. The richness of this data and the recent development of low-cost sensors have combined to present an attractive opportunity for mobile robotics research. In this paper, we describe a system for visual odometry and mapping using an RGB-D camera, and its application to autonomous flight. By leveraging results from recent state-of-the-art algorithms and hardware, our system enables 3D flight in cluttered environments using only onboard sensor data. All computation and sensing required for local position control are performed onboard the vehicle, reducing the dependence on an unreliable wireless link to a ground station. However, even with accurate 3D sensing and position estimation, some parts of the environment have more perceptual structure than others, leading to state estimates that vary in accuracy across the environment. If the vehicle plans a path without regard to how well it can localize itself along that path, it runs the risk of becoming lost or worse. We show how the belief roadmap algorithm (Prentice and Roy, 2009), a belief-space extension of the probabilistic roadmap algorithm, can be used to plan vehicle trajectories that incorporate the sensing model of the RGB-D camera. We evaluate the effectiveness of our system for controlling a quadrotor micro air vehicle, demonstrate its use for constructing detailed 3D maps of an indoor environment, and discuss its limitations.
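For a concrete picture of the belief-space planning idea described above, the following Python sketch propagates an EKF covariance along the edges of a toy roadmap and selects the candidate path with the lowest terminal uncertainty, in the spirit of the belief roadmap. The graph, noise magnitudes, and per-edge "texture" scores are illustrative assumptions, not the paper's actual sensing or process models.

import numpy as np

STATE_DIM = 2                 # planar position only, for brevity
A = np.eye(STATE_DIM)         # identity motion model per edge traversal
H = np.eye(STATE_DIM)         # position observed directly

def propagate(P, process_noise, texture):
    # One EKF predict/update step along an edge. Edges through visually rich
    # areas (high `texture`) get low measurement noise, so the update shrinks
    # the covariance more there.
    P_pred = A @ P @ A.T + process_noise * np.eye(STATE_DIM)
    R = (1.0 / max(texture, 1e-3)) * np.eye(STATE_DIM)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    return (np.eye(STATE_DIM) - K @ H) @ P_pred

# Toy roadmap: node -> list of (neighbor, process noise, visual texture).
# The "corridor" route is featureless; the "atrium" route is feature-rich.
GRAPH = {
    "start":    [("corridor", 0.05, 0.2), ("atrium", 0.05, 2.0)],
    "corridor": [("goal", 0.05, 0.2)],
    "atrium":   [("goal", 0.05, 2.0)],
    "goal":     [],
}

def all_paths(node, goal, visited=()):
    # Enumerate simple paths from `node` to `goal` by depth-first search.
    if node == goal:
        yield [node]
        return
    for nbr, _, _ in GRAPH[node]:
        if nbr not in visited:
            for tail in all_paths(nbr, goal, visited + (node,)):
                yield [node] + tail

def terminal_covariance(path, P0):
    # Propagate the initial covariance P0 along every edge of the path.
    P = P0
    for a, b in zip(path, path[1:]):
        noise, texture = next((q, t) for nbr, q, t in GRAPH[a] if nbr == b)
        P = propagate(P, noise, texture)
    return P

if __name__ == "__main__":
    P0 = 0.1 * np.eye(STATE_DIM)
    best = min(all_paths("start", "goal"),
               key=lambda p: np.trace(terminal_covariance(p, P0)))
    print("lowest-uncertainty path:", " -> ".join(best))

In the paper's system, the covariance transfer along each edge is derived from the vehicle's actual RGB-D sensing model rather than a scalar texture score, so paths are preferred where the camera can observe enough structure to keep the state estimate well constrained.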

Bibliographic Details
Main Authors: Bachrach, Abraham Galton; Prentice, Samuel James; He, Ruijie; Huang, Albert S.; Roy, Nicholas; Henry, Peter; Krainin, Michael; Maturana, Daniel; Fox, Dieter
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Aeronautics and Astronautics; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: en_US
Published: Sage Publications, 2013
Online Access: http://hdl.handle.net/1721.1/81874
ORCID: https://orcid.org/0000-0002-4959-7368; https://orcid.org/0000-0002-8293-0492
Published in: The International Journal of Robotics Research 31, no. 11 (September 11, 2012): 1320-1343
ISSN: 0278-3649, 1741-3176
DOI: http://dx.doi.org/10.1177/0278364912455256
Citation: Bachrach, A., S. Prentice, R. He, P. Henry, A. S. Huang, M. Krainin, D. Maturana, D. Fox, and N. Roy. "Estimation, planning, and mapping for autonomous flight using an RGB-D camera in GPS-denied environments." The International Journal of Robotics Research 31, no. 11 (September 11, 2012): 1320-1343.
License: Creative Commons Attribution-Noncommercial-Share Alike 3.0 (http://creativecommons.org/licenses/by-nc-sa/3.0/)
Funding: United States. Office of Naval Research (Grant MURI N00014-07-1-0749); United States. Office of Naval Research (Science of Autonomy Program N00014-09-1-0641); United States. Army Research Office (MAST CTA); United States. Office of Naval Research. Multidisciplinary University Research Initiative (Grant N00014-09-1-1052); National Science Foundation (U.S.) (Contract IIS-0812671); United States. Army Research Office (Robotics Consortium Agreement W911NF-10-2-0016); National Science Foundation (U.S.). Division of Information, Robotics, and Intelligent Systems (Grant 0546467)