Methodology of Detection and Classification of Selected Aviation Obstacles Based on UAV Dense Image Matching

Bibliographic Details
Main Authors: Marta Lalak, Damian Wierzbicki
Format: Article
Language: English
Published: IEEE, 2022-01-01
Series: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Online Access: https://ieeexplore.ieee.org/document/9706249/
Description
Summary: Increasingly accurate data provided by UAVs make it possible to analyze land cover, which requires detecting objects and their individual elements. Object detection and the determination of objects' geometric features are possible thanks to dense point clouds generated from imagery obtained at low altitudes. 3D data from UAVs prove extremely useful for ensuring airspace safety in the close vicinity of airports. This article presents a methodology for automatic aviation obstacle detection based on low-altitude UAV data. The research was carried out on a dense 3D point cloud. The developed methodology for detecting aviation obstacles consists of three main stages. The first is point cloud filtration based on height, which provides a preliminary identification of aviation obstacles. The second is 3D point cloud segmentation using a modified RANSAC algorithm, supplemented with two-dimensional vector data on aviation obstacles to improve the accuracy of the segmentation process. The last stage is the classification of aviation obstacles according to the adopted height and cross-section criteria. The proposed method of detecting aviation obstacles is characterized by high accuracy: the mean error of fitting the point cloud to the obstacle database ranged from ±0.04 m to ±0.07 m.
ISSN:2151-1535
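
The three-stage pipeline described in the summary (height-based filtration, RANSAC segmentation, classification by a height criterion) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it uses a plain textbook RANSAC plane fit rather than their modified variant, omits the supplementary 2D vector data, and every threshold value here is hypothetical.

```python
import random

def filter_by_height(points, min_z):
    """Stage 1: preliminary obstacle identification -- keep elevated points."""
    return [p for p in points if p[2] >= min_z]

def _plane_from_points(p1, p2, p3):
    """Plane (nx, ny, nz, d) through three points via the cross product."""
    ux, uy, uz = (p2[i] - p1[i] for i in range(3))
    vx, vy, vz = (p3[i] - p1[i] for i in range(3))
    nx = uy * vz - uz * vy
    ny = uz * vx - ux * vz
    nz = ux * vy - uy * vx
    d = -(nx * p1[0] + ny * p1[1] + nz * p1[2])
    return nx, ny, nz, d

def ransac_plane(points, iterations=200, threshold=0.05, seed=1):
    """Stage 2: basic RANSAC plane segmentation (textbook variant,
    not the paper's modified algorithm)."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iterations):
        sample = rng.sample(points, 3)
        nx, ny, nz, d = _plane_from_points(*sample)
        norm = (nx * nx + ny * ny + nz * nz) ** 0.5
        if norm == 0:
            continue  # degenerate (collinear) sample
        inliers = [p for p in points
                   if abs(nx * p[0] + ny * p[1] + nz * p[2] + d) / norm <= threshold]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers

def classify_by_height(obstacle_points, tall_threshold=100.0):
    """Stage 3: classify by a height criterion (threshold is hypothetical)."""
    top = max(p[2] for p in obstacle_points)
    return "tall obstacle" if top >= tall_threshold else "low obstacle"
```

On synthetic data (a flat "roof" of points above ground-level noise), the height filter isolates the elevated points, the RANSAC step recovers the planar segment, and the classifier labels it by its maximum height.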