VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization
Camera, inertial measurement unit (IMU), and ultra-wideband (UWB) sensors are commonplace solutions to unmanned aerial vehicle (UAV) localization problems. The performance of a localization system can be improved by integrating observations from different sensors. In this paper, we propose a learning-based UAV localization method using the fusion of vision, IMU, and UWB sensors. Our model consists of visual–inertial (VI) and UWB branches. We combine the estimation results of both branches to predict global poses. To evaluate our method, we augment a public VI dataset with UWB simulations and conduct a real-world experiment. The experimental results show that our method provides more robust and accurate results than VI/UWB-only localization. Our codes and data are available at https://imlabntu.github.io/VIUNet/.
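The abstract describes a two-branch design: a visual–inertial (VI) branch that estimates relative motion and a UWB branch that estimates global position, with the two combined into a global pose. The snippet below is only a minimal, hypothetical sketch of that general fusion idea, not the authors' released code; the module names, layer sizes, and the learned-weight fusion rule are all assumptions.

```python
# Hypothetical sketch of a two-branch VI + UWB fusion model (not the paper's implementation).
# Assumes the VI branch outputs a 6-DoF relative pose from image features + IMU readings,
# and the UWB branch outputs a 3-D global position from anchor range measurements.
import torch
import torch.nn as nn

class VIUFusionSketch(nn.Module):
    def __init__(self, img_feat_dim=512, imu_dim=6, num_anchors=8):
        super().__init__()
        # Visual-inertial branch: fuse image features and IMU readings into a relative pose.
        self.vi_branch = nn.Sequential(
            nn.Linear(img_feat_dim + imu_dim, 256), nn.ReLU(),
            nn.Linear(256, 6),          # 3-D translation + 3-D rotation (e.g. axis-angle)
        )
        # UWB branch: map anchor ranges to an absolute 3-D position.
        self.uwb_branch = nn.Sequential(
            nn.Linear(num_anchors, 64), nn.ReLU(),
            nn.Linear(64, 3),
        )
        # Learned scalar weight balancing the two position estimates (an assumption).
        self.alpha = nn.Parameter(torch.tensor(0.5))

    def forward(self, img_feat, imu, uwb_ranges, prev_global_pos):
        rel_pose = self.vi_branch(torch.cat([img_feat, imu], dim=-1))
        vi_global_pos = prev_global_pos + rel_pose[..., :3]   # dead-reckoned position
        uwb_global_pos = self.uwb_branch(uwb_ranges)           # absolute position from ranging
        w = torch.sigmoid(self.alpha)
        fused_pos = w * vi_global_pos + (1.0 - w) * uwb_global_pos
        return fused_pos, rel_pose[..., 3:]                    # fused position, VI rotation
```

A model along these lines would be trained with a pose loss against ground-truth trajectories; the paper itself evaluates on a public VI dataset augmented with simulated UWB ranges and on a real-world experiment.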
Main Authors: | Peng-Yuan Kao, Hsiu-Jui Chang, Kuan-Wei Tseng, Timothy Chen, He-Lin Luo, Yi-Ping Hung |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2023-01-01 |
Series: | IEEE Access |
Subjects: | Visual-inertial odometry; ultra-wideband; sensor fusion; deep learning |
Online Access: | https://ieeexplore.ieee.org/document/10131904/ |
_version_ | 1797796561180164096 |
---|---|
author | Peng-Yuan Kao; Hsiu-Jui Chang; Kuan-Wei Tseng; Timothy Chen; He-Lin Luo; Yi-Ping Hung |
author_facet | Peng-Yuan Kao; Hsiu-Jui Chang; Kuan-Wei Tseng; Timothy Chen; He-Lin Luo; Yi-Ping Hung |
author_sort | Peng-Yuan Kao |
collection | DOAJ |
description | Camera, inertial measurement unit (IMU), and ultra-wideband (UWB) sensors are commonplace solutions to unmanned aerial vehicle (UAV) localization problems. The performance of a localization system can be improved by integrating observations from different sensors. In this paper, we propose a learning-based UAV localization method using the fusion of vision, IMU, and UWB sensors. Our model consists of visual–inertial (VI) and UWB branches. We combine the estimation results of both branches to predict global poses. To evaluate our method, we augment a public VI dataset with UWB simulations and conduct a real-world experiment. The experimental results show that our method provides more robust and accurate results than VI/UWB-only localization. Our codes and data are available at https://imlabntu.github.io/VIUNet/. |
first_indexed | 2024-03-13T03:34:52Z |
format | Article |
id | doaj.art-d8b91e1a44ed423cb0be5f28ec1a762a |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-03-13T03:34:52Z |
publishDate | 2023-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-d8b91e1a44ed423cb0be5f28ec1a762a; 2023-06-23T23:00:30Z; eng; IEEE; IEEE Access; ISSN 2169-3536; published 2023-01-01; vol. 11, pp. 61525–61534; DOI 10.1109/ACCESS.2023.3279292; article 10131904; VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization; Peng-Yuan Kao (https://orcid.org/0000-0002-5582-1039), Graduate Institute of Networking and Multimedia, National Taiwan University, Taipei, Taiwan; Hsiu-Jui Chang, Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan; Kuan-Wei Tseng (https://orcid.org/0000-0003-1134-5314), Department of Computer Science, Tokyo Institute of Technology, Tokyo, Japan; Timothy Chen (https://orcid.org/0000-0001-7900-890X), Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan; He-Lin Luo (https://orcid.org/0000-0001-9788-3863), Graduate Institute of Animation and Film Art, Tainan National University of the Arts, Tainan, Taiwan; Yi-Ping Hung (https://orcid.org/0009-0007-3792-9509), Graduate Institute of Networking and Multimedia, National Taiwan University, Taipei, Taiwan; https://ieeexplore.ieee.org/document/10131904/; Visual-inertial odometry; ultra-wideband; sensor fusion; deep learning |
spellingShingle | Peng-Yuan Kao; Hsiu-Jui Chang; Kuan-Wei Tseng; Timothy Chen; He-Lin Luo; Yi-Ping Hung; VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization; IEEE Access; Visual-inertial odometry; ultra-wideband; sensor fusion; deep learning |
title | VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization |
title_full | VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization |
title_fullStr | VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization |
title_full_unstemmed | VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization |
title_short | VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization |
title_sort | viunet deep visual–inertial–uwb fusion for indoor uav localization |
topic | Visual-inertial odometry; ultra-wideband; sensor fusion; deep learning |
url | https://ieeexplore.ieee.org/document/10131904/ |
work_keys_str_mv | AT pengyuankao viunetdeepvisualinertialuwbfusionforindooruavlocalization AT hsiujuichang viunetdeepvisualinertialuwbfusionforindooruavlocalization AT kuanweitseng viunetdeepvisualinertialuwbfusionforindooruavlocalization AT timothychen viunetdeepvisualinertialuwbfusionforindooruavlocalization AT helinluo viunetdeepvisualinertialuwbfusionforindooruavlocalization AT yipinghung viunetdeepvisualinertialuwbfusionforindooruavlocalization |