Landing System Development Based on Inverse Homography Range Camera Fusion (IHRCF)

The Unmanned Aerial Vehicle (UAV) is one of the most remarkable inventions of the last 100 years, and much research has been invested in the development of this flying robot. The landing system is one of the more challenging aspects of its development. Artificial Intelligence (AI) techniques, including reinforcement learning, have become the preferred approach to landing system development; however, current research is focused more on system development based on image processing and advanced geometry. A novel calibration based on our previous research has been used to improve the accuracy of AprilTag pose estimation. With the help of advanced geometry applied to camera and range sensor data, a process known as Inverse Homography Range Camera Fusion (IHRCF), a pose estimate that outperforms our previous work is now possible. The range sensor used here is a Time of Flight (ToF) sensor, but the algorithm can be used with any range sensor. First, images are captured by the image acquisition device, a monocular camera. Next, the corners of the landing landmark are detected with the AprilTag detection algorithm (ATDA). The pixel correspondence between the image and the range sensor is then calculated from the calibration data. In the succeeding phase, the planar homography between the real-world locations of the sensor data and their corresponding pixel coordinates is calculated. Next, the pixel coordinates of the four detected AprilTag corners are transformed through the inverse planar homography from pixel coordinates to world coordinates in the camera frame. Finally, knowing the world-frame corner points of the AprilTag, a rigid body transformation can be used to recover the pose. A CoppeliaSim simulation environment was used to evaluate the IHRCF algorithm, and the test was run as a real-time Software-in-the-Loop (SIL) simulation. The IHRCF algorithm significantly outperformed the AprilTag-only detection approach in both translation and rotation. In conclusion, the conventional landmark detection algorithm can be improved by incorporating sensor fusion for cameras with lower radial distortion.
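
The geometric core of the abstract above, estimating a planar homography between range-sensor points on the landing plane and their pixel coordinates and then pushing the detected AprilTag corner pixels through the inverse homography, can be sketched roughly as follows. This is a minimal illustration in Python with NumPy and OpenCV under our own assumptions, not the authors' implementation; the names sensor_xy, sensor_px, and tag_corners_px, and the premise that calibration pairs plane-resident range-sensor points with image pixels, are hypothetical.

```python
# Minimal sketch of the inverse-homography step outlined in the abstract.
# Assumption (ours, not from the paper): calibration pairs each range-sensor
# point on the landing plane, given as planar (X, Y) coordinates, with the
# pixel it projects to in the monocular image.
import numpy as np
import cv2

def lift_corners_to_plane(sensor_xy, sensor_px, tag_corners_px):
    """Map detected AprilTag corner pixels onto the landing plane.

    sensor_xy      : (N, 2) planar coordinates of range-sensor points
    sensor_px      : (N, 2) pixel coordinates of those same points
    tag_corners_px : (4, 2) pixel coordinates of the detected tag corners
    returns        : (4, 2) plane coordinates of the tag corners
    """
    # Planar homography mapping plane coordinates to pixel coordinates.
    H, _ = cv2.findHomography(sensor_xy.astype(np.float32),
                              sensor_px.astype(np.float32), cv2.RANSAC)
    # The inverse homography sends pixels back to plane coordinates.
    H_inv = np.linalg.inv(H)
    corners = tag_corners_px.reshape(-1, 1, 2).astype(np.float32)
    return cv2.perspectiveTransform(corners, H_inv).reshape(-1, 2)
```

The four recovered plane points, combined with the tag's known side length, then supply the 3-D correspondences for the rigid-body pose step sketched after the description field below.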

Bibliographic Details
Main Authors: Mohammad Sefidgar, Rene Landry
Format: Article
Language: English
Published: MDPI AG 2022-02-01
Series: Sensors
Subjects: inverse planar homography; sensor fusion; navigation landing system design; pose estimation
Online Access: https://www.mdpi.com/1424-8220/22/5/1870
_version_ 1797473782014672896
author Mohammad Sefidgar
Rene Landry
author_facet Mohammad Sefidgar
Rene Landry
author_sort Mohammad Sefidgar
collection DOAJ
description The Unmanned Aerial Vehicle (UAV) is one of the most remarkable inventions of the last 100 years, and much research has been invested in the development of this flying robot. The landing system is one of the more challenging aspects of its development. Artificial Intelligence (AI) techniques, including reinforcement learning, have become the preferred approach to landing system development; however, current research is focused more on system development based on image processing and advanced geometry. A novel calibration based on our previous research has been used to improve the accuracy of AprilTag pose estimation. With the help of advanced geometry applied to camera and range sensor data, a process known as Inverse Homography Range Camera Fusion (IHRCF), a pose estimate that outperforms our previous work is now possible. The range sensor used here is a Time of Flight (ToF) sensor, but the algorithm can be used with any range sensor. First, images are captured by the image acquisition device, a monocular camera. Next, the corners of the landing landmark are detected with the AprilTag detection algorithm (ATDA). The pixel correspondence between the image and the range sensor is then calculated from the calibration data. In the succeeding phase, the planar homography between the real-world locations of the sensor data and their corresponding pixel coordinates is calculated. Next, the pixel coordinates of the four detected AprilTag corners are transformed through the inverse planar homography from pixel coordinates to world coordinates in the camera frame. Finally, knowing the world-frame corner points of the AprilTag, a rigid body transformation can be used to recover the pose. A CoppeliaSim simulation environment was used to evaluate the IHRCF algorithm, and the test was run as a real-time Software-in-the-Loop (SIL) simulation. The IHRCF algorithm significantly outperformed the AprilTag-only detection approach in both translation and rotation. In conclusion, the conventional landmark detection algorithm can be improved by incorporating sensor fusion for cameras with lower radial distortion.
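
The final step named in the description, recovering the pose from the known tag-frame corners and their reconstructed camera-frame positions through a rigid body transformation, is commonly solved with an SVD-based (Kabsch) fit. The NumPy sketch below shows that standard fit under our own assumptions; the helper name rigid_transform and the point layout are illustrative and not taken from the paper.

```python
# Illustrative SVD-based (Kabsch) rigid body fit for the pose-recovery step.
# Assumption (ours): 3-D correspondences between the four tag-frame corners
# and the same corners reconstructed in the camera frame.
import numpy as np

def rigid_transform(tag_pts, cam_pts):
    """Return R (3x3) and t (3,) such that cam_pts[i] ~ R @ tag_pts[i] + t."""
    c_tag, c_cam = tag_pts.mean(axis=0), cam_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (tag_pts - c_tag).T @ (cam_pts - c_cam)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the least-squares rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_cam - R @ c_tag
    return R, t
```

Attitude relative to the landing pad can then be read from R, while t gives the position of the tag origin expressed in the camera frame.
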
first_indexed 2024-03-09T20:21:19Z
format Article
id doaj.art-5b715e3b9db5481985c4ff1584638831
institution Directory Open Access Journal
issn 1424-8220
language English
last_indexed 2024-03-09T20:21:19Z
publishDate 2022-02-01
publisher MDPI AG
record_format Article
series Sensors
spelling doaj.art-5b715e3b9db5481985c4ff1584638831 2023-11-23T23:47:32Z eng MDPI AG Sensors 1424-8220 2022-02-01 22(5):1870 doi:10.3390/s22051870 Landing System Development Based on Inverse Homography Range Camera Fusion (IHRCF) Mohammad Sefidgar (LASSENA Laboratory, École de Technologie Supérieure (ÉTS), Montreal, QC H3C 1K3, Canada); Rene Landry (LASSENA Laboratory, École de Technologie Supérieure (ÉTS), Montreal, QC H3C 1K3, Canada) The Unmanned Aerial Vehicle (UAV) is one of the most remarkable inventions of the last 100 years, and much research has been invested in the development of this flying robot. The landing system is one of the more challenging aspects of its development. Artificial Intelligence (AI) techniques, including reinforcement learning, have become the preferred approach to landing system development; however, current research is focused more on system development based on image processing and advanced geometry. A novel calibration based on our previous research has been used to improve the accuracy of AprilTag pose estimation. With the help of advanced geometry applied to camera and range sensor data, a process known as Inverse Homography Range Camera Fusion (IHRCF), a pose estimate that outperforms our previous work is now possible. The range sensor used here is a Time of Flight (ToF) sensor, but the algorithm can be used with any range sensor. First, images are captured by the image acquisition device, a monocular camera. Next, the corners of the landing landmark are detected with the AprilTag detection algorithm (ATDA). The pixel correspondence between the image and the range sensor is then calculated from the calibration data. In the succeeding phase, the planar homography between the real-world locations of the sensor data and their corresponding pixel coordinates is calculated. Next, the pixel coordinates of the four detected AprilTag corners are transformed through the inverse planar homography from pixel coordinates to world coordinates in the camera frame. Finally, knowing the world-frame corner points of the AprilTag, a rigid body transformation can be used to recover the pose. A CoppeliaSim simulation environment was used to evaluate the IHRCF algorithm, and the test was run as a real-time Software-in-the-Loop (SIL) simulation. The IHRCF algorithm significantly outperformed the AprilTag-only detection approach in both translation and rotation. In conclusion, the conventional landmark detection algorithm can be improved by incorporating sensor fusion for cameras with lower radial distortion. https://www.mdpi.com/1424-8220/22/5/1870 inverse planar homography; sensor fusion; navigation landing system design; pose estimation
spellingShingle Mohammad Sefidgar
Rene Landry
Landing System Development Based on Inverse Homography Range Camera Fusion (IHRCF)
Sensors
inverse planar homography
sensor fusion
navigation landing system design
pose estimation
title Landing System Development Based on Inverse Homography Range Camera Fusion (IHRCF)
title_full Landing System Development Based on Inverse Homography Range Camera Fusion (IHRCF)
title_fullStr Landing System Development Based on Inverse Homography Range Camera Fusion (IHRCF)
title_full_unstemmed Landing System Development Based on Inverse Homography Range Camera Fusion (IHRCF)
title_short Landing System Development Based on Inverse Homography Range Camera Fusion (IHRCF)
title_sort landing system development based on inverse homography range camera fusion ihrcf
topic inverse planar homography
sensor fusion
navigation landing system design
pose estimation
url https://www.mdpi.com/1424-8220/22/5/1870
work_keys_str_mv AT mohammadsefidgar landingsystemdevelopmentbasedoninversehomographyrangecamerafusionihrcf
AT renelandry landingsystemdevelopmentbasedoninversehomographyrangecamerafusionihrcf