Visual and inertial odometry for mobile robot

Bibliographic Details
Main Author: Leong, Jing Chen
Other Authors: Seet Gim Lee, Gerald
School: School of Mechanical and Aerospace Engineering, Robotics Research Centre
Degree: Bachelor of Engineering (Mechanical Engineering)
Institution: Nanyang Technological University
Format: Final Year Project (FYP)
Language: English
Published: 2019
Extent: 76 p.
Subjects: DRNTU::Engineering::Mechanical engineering
Online Access: http://hdl.handle.net/10356/78537

Description

In recent years, unmanned aerial vehicles (UAVs) have been well received in both consumer and industrial applications owing to their versatility and cost-effectiveness. During the Singapore International Water Week, the national water agency (PUB) showcased smart technologies for complex tasks such as deep tunnel sewerage system inspection [1]. Instead of sending inspection personnel into the deep tunnel sewerage system, an environment hostile to humans, PUB deployed UAVs to carry out the inspections. To manoeuvre in such an environment, a UAV must be able to localize itself. Most developers rely on the Global Positioning System (GPS) for this, but GPS cannot determine a UAV's position when the signal is weak or absent, as in indoor urban environments and tunnel networks. An alternative is to localize the UAV with on-board sensors such as an inertial measurement unit (IMU), cameras, and lidar.

The objective of this project is to localize a UAV with respect to features in the environment, for tunnel and indoor tasks, using vision and inertial sensors. In a tunnel, features can be scarce and poorly illuminated. A camera performs poorly in the dark and at higher flight speeds because of motion blur, while an IMU drifts when held stationary or moving slowly because sensor noise then dominates the motion signal. The two sensors therefore complement each other: the IMU carries the estimate at higher flight velocities, where it is effective, while the camera constrains the estimate at lower speeds, where it resolves small movements well.

Several challenges were encountered during the project, notably camera and IMU calibration and hardware synchronization between the camera and the IMU. A self-localizing UAV was set up by integrating an on-board computing unit, a camera, and an IMU. Prior to state estimation, camera calibration, IMU calibration, and hardware synchronization of the camera and IMU were carried out. State estimation was performed with the VINS-Mono estimator. Experimental evaluation then compared the UAV's localization against ground-truth data recorded at the Motion Analysis Laboratory of Nanyang Technological University, and a comparison study assessed the robustness and reliability of the VINS-Mono estimator and the UAV system under various flight velocities and environment-feature settings. Based on the results, the implementation is practical with a synchronized camera-IMU setup: it localizes the UAV in a GPS-denied environment with an average root mean square error kept below 25 cm.
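The complementarity argument above (IMU for fast motion, camera for slow motion) can be made concrete with a toy example. The sketch below is a minimal 1-D complementary filter on synthetic data; it is not the project's actual fusion (VINS-Mono is a tightly coupled, optimization-based estimator), and the sample rate, noise levels, and bias are all invented for the demonstration.

```python
# Toy 1-D illustration of camera/IMU complementarity. NOT the fusion used in
# the project (VINS-Mono is a tightly coupled optimization-based estimator);
# it only shows why the sensors cover each other's weaknesses: integrated IMU
# data drifts over long horizons, camera measurements get noisy when motion
# is fast, and a simple complementary filter beats either sensor alone.
import numpy as np

rng = np.random.default_rng(0)
dt, n = 0.005, 4000                        # 200 Hz for 20 s (assumed)
t = np.arange(n) * dt
true_vel = np.where(t < 10.0, 0.05, 1.0)   # slow hover, then fast flight (m/s)

# IMU: true acceleration plus white noise and a constant bias -> drift.
accel = np.gradient(true_vel, dt) + 0.4 * rng.standard_normal(n) + 0.05
imu_vel = true_vel[0] + np.cumsum(accel) * dt

# Camera: accurate when slow, much noisier when fast (motion blur).
blur = np.where(t < 10.0, 0.02, 0.5)
cam_vel = true_vel + blur * rng.standard_normal(n)

# Complementary filter: propagate with the IMU, correct slowly to the camera.
alpha = 0.98
fused = np.empty(n)
fused[0] = cam_vel[0]
for k in range(1, n):
    fused[k] = alpha * (fused[k - 1] + accel[k] * dt) + (1 - alpha) * cam_vel[k]

for name, v in (("IMU only", imu_vel), ("camera only", cam_vel), ("fused", fused)):
    print(f"{name:12s} RMSE {np.sqrt(np.mean((v - true_vel) ** 2)):.3f} m/s")
```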
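The abstract names camera calibration as a prerequisite but not the procedure used. Purely as an illustration of the camera-intrinsics step, here is a minimal sketch using OpenCV's standard chessboard calibration; the file pattern, board geometry, and square size are assumptions.

```python
# Minimal camera-intrinsics calibration sketch (OpenCV chessboard method).
# Assumptions: images named calib_*.png, a 9x6 inner-corner chessboard,
# 25 mm squares. Illustrative only, not the thesis's actual procedure.
import glob
import cv2
import numpy as np

BOARD = (9, 6)       # inner corners per row, column (assumed)
SQUARE = 0.025       # square size in metres (assumed)

# 3D corner positions in the board frame, reused for every view.
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_points, img_points, gray = [], [], None
for path in glob.glob("calib_*.png"):   # hypothetical file pattern
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD, None)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Solve for the intrinsic matrix K and distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("K =\n", K)
```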
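Camera-IMU hardware synchronization is flagged as one of the main difficulties. A common software-side sanity check, though not necessarily what this project did, is to estimate the residual time offset by cross-correlating the gyroscope signal with the rotation rate inferred from the camera. The sketch below demonstrates the idea on synthetic signals with an injected offset; the sample rate and offset are arbitrary.

```python
# Time-offset estimation between two sensor streams by cross-correlation.
# Synthetic stand-ins: 'gyro' mimics an angular-rate channel, 'cam' mimics
# the same motion recovered from the camera, delayed by a 45 ms offset.
import numpy as np

rng = np.random.default_rng(1)
rate = 200.0                          # shared sample rate in Hz (assumed)
t = np.arange(0, 10, 1 / rate)
true_offset = 0.045                   # 45 ms lag injected for the demo
gyro = np.sin(2 * np.pi * 0.7 * t) + 0.05 * rng.standard_normal(t.size)
cam = np.sin(2 * np.pi * 0.7 * (t - true_offset)) \
      + 0.05 * rng.standard_normal(t.size)

# Full cross-correlation; the lag of the peak is the offset estimate.
xcorr = np.correlate(cam - cam.mean(), gyro - gyro.mean(), mode="full")
lag = np.argmax(xcorr) - (t.size - 1)   # positive: camera lags the gyro
print(f"estimated offset: {lag / rate * 1000:.1f} ms (injected 45.0 ms)")
```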
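The headline result, an average root mean square error under 25 cm against motion-capture ground truth, corresponds to the standard absolute trajectory error (ATE) evaluation. Below is a minimal sketch of that computation, assuming time-synchronized position samples; the rigid alignment (Kabsch, without scale) and the trajectories themselves are illustrative stand-ins, not the project's data.

```python
# ATE-style evaluation: RMSE of absolute position error between an estimated
# trajectory and motion-capture ground truth, after least-squares rigid
# alignment. Arrays are hypothetical stand-ins for the real recordings.
import numpy as np

def align_rigid(est, gt):
    """Least-squares rotation+translation mapping est onto gt (Nx3 each)."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    H = (est - mu_e).T @ (gt - mu_g)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                          # reflection-safe Kabsch
    t = mu_g - R @ mu_e
    return est @ R.T + t

def ate_rmse(est, gt):
    aligned = align_rigid(est, gt)
    err = np.linalg.norm(aligned - gt, axis=1)  # per-sample position error
    return float(np.sqrt(np.mean(err ** 2)))

# Hypothetical usage: rows are time-synchronized xyz samples in metres.
rng = np.random.default_rng(2)
gt = np.cumsum(rng.standard_normal((500, 3)) * 0.01, axis=0)
est = gt + rng.standard_normal((500, 3)) * 0.02
print(f"ATE RMSE: {ate_rmse(est, gt):.3f} m")   # thesis reports < 0.25 m
```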