Framework for Autonomous UAV Navigation and Target Detection in Global-Navigation-Satellite-System-Denied and Visually Degraded Environments
Autonomous Unmanned Aerial Vehicles (UAVs) have potential applications in wildlife monitoring, disaster monitoring, and emergency Search and Rescue (SAR). Autonomous capabilities such as waypoint flight modes and obstacle avoidance, as well as their ability to survey large areas, make UAVs the prime...
Main Authors: | Sebastien Boiteau, Fernando Vanegas, Felipe Gonzalez |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2024-01-01 |
Series: | Remote Sensing |
Subjects: | partially observable Markov decision process; unmanned aerial vehicles; search and rescue; low visibility; embedded systems; remote sensing |
Online Access: | https://www.mdpi.com/2072-4292/16/3/471 |
author | Sebastien Boiteau; Fernando Vanegas; Felipe Gonzalez
---|---|
collection | DOAJ |
description | Autonomous Unmanned Aerial Vehicles (UAVs) have potential applications in wildlife monitoring, disaster monitoring, and emergency Search and Rescue (SAR). Autonomous capabilities such as waypoint flight modes and obstacle avoidance, as well as their ability to survey large areas, make UAVs the prime choice for these critical applications. However, autonomous UAVs usually rely on the Global Navigation Satellite System (GNSS) for navigation and on normal visibility conditions to obtain observations and data on their surrounding environment. These two conditions are often lacking in the challenging scenarios in which such critical applications take place, limiting the range of utilisation of autonomous UAVs. This paper presents a framework enabling a UAV to autonomously navigate and detect targets in GNSS-denied and visually degraded environments. The navigation and target detection problem is formulated as a Sequential Decision Problem (SDP) with uncertainty caused by the lack of GNSS and by low visibility. The SDP is modelled as a Partially Observable Markov Decision Process (POMDP) and solved using the Adaptive Belief Tree (ABT) algorithm. The framework is tested in simulation and in real life on a navigation task based on a classic SAR operation in a cluttered indoor environment under different visibility conditions. The framework comprises a small UAV weighing 5 kg, a thermal camera used for target detection, and an onboard computer running all the computationally intensive tasks. The results of this study show the robustness of the proposed framework in autonomously exploring and detecting targets using thermal imagery under different visibility conditions. Devising UAVs that are capable of navigating in challenging environments with degraded visibility can encourage authorities and public institutions to consider the use of autonomous remote platforms to locate stranded people in disaster scenarios. |
format | Article |
id | doaj.art-573ac6466808457fa3dbb717bb03336b |
institution | Directory Open Access Journal |
issn | 2072-4292 |
language | English |
publishDate | 2024-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Remote Sensing |
doi | 10.3390/rs16030471
citation | Remote Sensing, vol. 16, no. 3, art. 471 (2024-01-01)
affiliation | School of Electrical Engineering and Robotics, Queensland University of Technology (QUT), 2 George Street, Brisbane City, QLD 4000, Australia (all three authors)
title | Framework for Autonomous UAV Navigation and Target Detection in Global-Navigation-Satellite-System-Denied and Visually Degraded Environments |
topic | partially observable Markov decision process; unmanned aerial vehicles; search and rescue; low visibility; embedded systems; remote sensing
url | https://www.mdpi.com/2072-4292/16/3/471 |