VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization

Bibliographic Details
Main Authors: Peng-Yuan Kao, Hsiu-Jui Chang, Kuan-Wei Tseng, Timothy Chen, He-Lin Luo, Yi-Ping Hung
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10131904/
Description
Summary: Camera, inertial measurement unit (IMU), and ultra-wideband (UWB) sensors are commonplace solutions to unmanned aerial vehicle (UAV) localization problems. The performance of a localization system can be improved by integrating observations from different sensors. In this paper, we propose a learning-based UAV localization method using the fusion of vision, IMU, and UWB sensors. Our model consists of visual–inertial (VI) and UWB branches. We combine the estimation results of both branches to predict global poses. To evaluate our method, we augment a public VI dataset with UWB simulations and conduct a real-world experiment. The experimental results show that our method provides more robust and accurate results than VI/UWB-only localization. Our code and data are available at https://imlabntu.github.io/VIUNet/.
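
As a rough illustration of the two-branch fusion described in the summary, the sketch below combines pose estimates from a visual-inertial branch and a UWB branch into a single global pose. The module name FusionPoseHead, the 7-D pose parameterization (3-D position plus quaternion), and the concatenation-plus-MLP fusion are assumptions made for this sketch, not the authors' actual VIUNet implementation.

# Hypothetical sketch of a two-branch pose fusion head in PyTorch.
# Layer sizes, feature dimensions, and the fusion strategy are assumptions.
import torch
import torch.nn as nn

class FusionPoseHead(nn.Module):
    """Fuses a visual-inertial pose estimate and a UWB pose estimate
    into one global pose: [x, y, z, qw, qx, qy, qz]."""

    def __init__(self, feat_dim: int = 64):
        super().__init__()
        # Each branch is assumed to output a 7-D pose vector.
        self.fuse = nn.Sequential(
            nn.Linear(7 + 7, feat_dim),
            nn.ReLU(),
            nn.Linear(feat_dim, 7),
        )

    def forward(self, vi_pose: torch.Tensor, uwb_pose: torch.Tensor) -> torch.Tensor:
        # Concatenate both branch estimates and regress the fused global pose.
        fused = self.fuse(torch.cat([vi_pose, uwb_pose], dim=-1))
        # Re-normalize the quaternion part so the output is a valid rotation.
        pos, quat = fused[..., :3], fused[..., 3:]
        quat = quat / quat.norm(dim=-1, keepdim=True).clamp_min(1e-8)
        return torch.cat([pos, quat], dim=-1)

if __name__ == "__main__":
    head = FusionPoseHead()
    vi = torch.randn(8, 7)   # placeholder VI-branch poses for a batch of 8 frames
    uwb = torch.randn(8, 7)  # placeholder UWB-branch poses
    print(head(vi, uwb).shape)  # torch.Size([8, 7])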
ISSN:2169-3536