UNMANNED AERIAL VEHICLE PHOTOGRAMMETRIC PRODUCTS ACCURACY ASSESSMENT: A REVIEW

Bibliographic Details
Main Authors: L. Rabiu, A. Ahmad
Format: Article
Language:English
Published: Copernicus Publications 2023-02-01
Series:The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Online Access:https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLVIII-4-W6-2022/279/2023/isprs-archives-XLVIII-4-W6-2022-279-2023.pdf
Description
Summary: Digital photogrammetry is an effective way of gathering data for DEM extraction, and recent advances in recording techniques and data processing allow photogrammetric 3D models to be generated faster and at higher resolution. These models offer high spatial and spectral resolution together with good geometric positioning accuracy. DEM quality is the primary requirement for any application and must satisfy users' requirements; it is usually affected by several factors during the acquisition and processing stages. Considerable research has been conducted on the parameters influencing DEM accuracy. This review discusses topics related to unmanned aerial vehicle (UAV) DEM accuracy assessment. Five parameters are considered: UAV technology; UAV georeferencing; UAV and computer vision; UAV and LiDAR; and UAV flight parameters. A summary of the methods, their strengths and weaknesses, and the study regions of the most recent articles is presented. Based on this review, conclusions are drawn on the challenging issues in UAV DEM accuracy that need more attention from the geospatial community, and suggestions for future work are offered. Other possible factors not treated in this paper may also exist.
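
As an illustration of the kind of accuracy assessment the review surveys (not a method taken from the paper itself), DEM vertical accuracy is commonly reported as the root-mean-square error (RMSE) between DEM elevations sampled at independent check points and their surveyed heights. The sketch below is a minimal Python example; it assumes a GeoTIFF DEM readable with the rasterio library, and the file name and check-point values are hypothetical.

# Minimal sketch: vertical RMSE of a UAV-derived DEM against surveyed check points.
# Assumes "uav_dem.tif" and the check points share the same coordinate reference system.
import math

import rasterio  # assumed available; any raster reader would serve

# (easting, northing, surveyed elevation) for independent check points (hypothetical values)
check_points = [
    (401210.5, 654320.2, 112.43),
    (401355.8, 654501.7, 115.02),
    (401498.1, 654398.9, 110.88),
]

with rasterio.open("uav_dem.tif") as dem:
    xy = [(e, n) for e, n, _ in check_points]
    # dem.sample() yields one array of band values per coordinate; band 1 holds elevation
    dem_heights = [vals[0] for vals in dem.sample(xy)]

errors = [z_dem - z_ref for z_dem, (_, _, z_ref) in zip(dem_heights, check_points)]
rmse_z = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"Vertical RMSE over {len(errors)} check points: {rmse_z:.3f} m")

The same pattern extends to horizontal accuracy or to comparison against a LiDAR-derived reference surface; the choice of metric and the distribution of check points remain the user's decision.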
ISSN:1682-1750
2194-9034