An Embedded High-Precision GNSS-Visual-Inertial Multi-Sensor Fusion Suite

Because of the high complementarity between global navigation satellite systems (GNSSs) and visual-inertial odometry (VIO), integrated GNSS-VIO navigation technology has been the subject of increased attention in recent years. In this paper, we propose an embedded high-precision multi-sensor fusion suite that includes a multi-frequency, multi-constellation GNSS module, a consumer-grade inertial measurement unit (IMU), and a grayscale camera. The suite uses an NVIDIA Jetson Xavier NX as the host and a field-programmable gate array (FPGA)-based controller for hardware time synchronization between the heterogeneous sensors. A multi-state constraint Kalman filter generates a tightly coupled estimate from the camera and the IMU, and the GNSS output is then loosely coupled with this estimate to obtain a globally drift-free solution. Calibration results show that the time synchronization accuracy of the suite is better than 30 µs (standard deviation [STD]) and that the camera-IMU projection error is less than 0.1 pixels (STD), highlighting the advantage of the hardware time synchronization mechanism. Vehicle-mounted tests on urban roads show a reduction in the three-dimensional (3D) positioning error from 8.455 m to 5.751 m (root mean square), a significant improvement in the accuracy and continuity of standalone GNSS positioning. In underground sites where satellite signals are completely unavailable, the 3D position error drift of the suite is only 1.58‰, which also demonstrates excellent performance.
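
To make the reported figures concrete, the following is a minimal illustrative sketch (not taken from the paper) of how the two headline metrics are typically computed: the 3D root-mean-square (RMS) positioning error against a reference trajectory, and the per-mille position drift, i.e., the end-point error relative to the distance traveled. All function names and the synthetic data are hypothetical and for illustration only.

import numpy as np

def rms_3d_error(estimated, reference):
    # 3D RMS positioning error between two time-aligned N x 3 trajectories (meters).
    errors = np.linalg.norm(estimated - reference, axis=1)  # per-epoch 3D error
    return np.sqrt(np.mean(errors ** 2))

def drift_per_mille(estimated, reference):
    # End-point 3D position error expressed in per mille of the distance traveled.
    final_error = np.linalg.norm(estimated[-1] - reference[-1])
    traveled = np.sum(np.linalg.norm(np.diff(reference, axis=0), axis=1))
    return 1000.0 * final_error / traveled

# Synthetic example: a 1-km straight track and an estimate that drifts laterally.
t = np.linspace(0.0, 1000.0, 2001)  # along-track position, meters
reference = np.column_stack([t, np.zeros_like(t), np.zeros_like(t)])
estimated = reference + np.column_stack([np.zeros_like(t), 0.002 * t, np.zeros_like(t)])

print(f"3D RMS error: {rms_3d_error(estimated, reference):.3f} m")
print(f"End-point drift: {drift_per_mille(estimated, reference):.2f} per mille")

In practice, the estimated and reference trajectories must first be time-aligned and expressed in a common coordinate frame before such metrics are evaluated.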


Bibliographic Details
Main Authors: Cheng Liu, Shuai Xiong, Yongchao Geng, Song Cheng, Fang Hu, Bo Shao, Fang Li, Jie Zhang
Format: Article
Language: English
Published: Institute of Navigation, 2023-08-01
Series: Navigation, Vol. 70, No. 4
ISSN: 2161-4296
DOI: 10.33012/navi.607
Collection: Directory of Open Access Journals (DOAJ)
Online Access: https://navi.ion.org/content/70/4/navi.607