Comparing YOLOv3, YOLOv4 and YOLOv5 for Autonomous Landing Spot Detection in Faulty UAVs
In-flight system failure is one of the major safety concerns in the operation of unmanned aerial vehicles (UAVs) in urban environments. To address this concern, a safety framework consisting of the following three main tasks can be utilized: (1) monitoring the health of the UAV and detecting failures, (2) finding potential safe landing spots in case a critical failure is detected in step 1, and (3) steering the UAV to a safe landing spot found in step 2. In this paper, we specifically look at the second task, where we investigate the feasibility of utilizing object detection methods to identify safe landing spots in case the UAV suffers an in-flight failure. In particular, we investigate different versions of the YOLO object detection method and compare their performance for the specific application of detecting a safe landing location for a UAV that has suffered an in-flight failure. We compare the performance of YOLOv3, YOLOv4, and YOLOv5l, training them on a large aerial image dataset called DOTA on both a Personal Computer (PC) and a Companion Computer (CC). We plan to run the chosen algorithm on a CC that can be attached to a UAV, and the PC is used to verify the trends that we see between the algorithms on the CC. We confirm the feasibility of utilizing these algorithms for effective emergency landing spot detection and report their accuracy and speed for this specific application. Our investigation also shows that the YOLOv5l algorithm outperforms YOLOv4 and YOLOv3 in detection accuracy while running at a slightly slower inference speed.
Main Authors: | Upesh Nepal, Hossein Eslamiat |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-01-01 |
Series: | Sensors |
Subjects: | object detection; DOTA aerial image dataset; deep learning; YOLOv3; YOLOv4; YOLOv5 |
Online Access: | https://www.mdpi.com/1424-8220/22/2/464 |
_version_ | 1797490550502326272 |
author | Upesh Nepal; Hossein Eslamiat |
author_facet | Upesh Nepal; Hossein Eslamiat |
author_sort | Upesh Nepal |
collection | DOAJ |
description | In-flight system failure is one of the major safety concerns in the operation of unmanned aerial vehicles (UAVs) in urban environments. To address this concern, a safety framework consisting of the following three main tasks can be utilized: (1) monitoring the health of the UAV and detecting failures, (2) finding potential safe landing spots in case a critical failure is detected in step 1, and (3) steering the UAV to a safe landing spot found in step 2. In this paper, we specifically look at the second task, where we investigate the feasibility of utilizing object detection methods to identify safe landing spots in case the UAV suffers an in-flight failure. In particular, we investigate different versions of the YOLO object detection method and compare their performance for the specific application of detecting a safe landing location for a UAV that has suffered an in-flight failure. We compare the performance of YOLOv3, YOLOv4, and YOLOv5l, training them on a large aerial image dataset called DOTA on both a Personal Computer (PC) and a Companion Computer (CC). We plan to run the chosen algorithm on a CC that can be attached to a UAV, and the PC is used to verify the trends that we see between the algorithms on the CC. We confirm the feasibility of utilizing these algorithms for effective emergency landing spot detection and report their accuracy and speed for this specific application. Our investigation also shows that the YOLOv5l algorithm outperforms YOLOv4 and YOLOv3 in detection accuracy while running at a slightly slower inference speed. |
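The accuracy comparison the abstract describes is conventionally scored by matching predicted boxes against ground-truth boxes via Intersection-over-Union (IoU). The sketch below is a generic illustration of that convention, not code from the paper; the function names, box format `(x1, y1, x2, y2)`, and the 0.5 threshold are illustrative assumptions.

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def match_detections(detections, ground_truth, iou_thresh=0.5):
    """Keep detections that overlap some ground-truth safe-landing box
    with IoU >= iou_thresh (a common detection-evaluation convention)."""
    return [det for det in detections
            if any(iou(det, gt) >= iou_thresh for gt in ground_truth)]
```

For example, a predicted box (0, 0, 2, 2) scored against a ground-truth box (1, 1, 3, 3) gives IoU 1/7 ≈ 0.14, below the common 0.5 threshold, so it would not count as a correct landing-spot detection.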
first_indexed | 2024-03-10T00:34:35Z |
format | Article |
id | doaj.art-95e1e52639cb4f0bb982f6c125f9e307 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
last_indexed | 2024-03-10T00:34:35Z |
publishDate | 2022-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj.art-95e1e52639cb4f0bb982f6c125f9e307 (2023-11-23T15:19:03Z) |
doi | 10.3390/s22020464 (Sensors, vol. 22, no. 2, art. 464) |
affiliation | Upesh Nepal; Hossein Eslamiat: Mechanical, Aerospace, and Material Engineering, Southern Illinois University Carbondale, 1230 Lincoln Dr, Carbondale, IL 62901, USA |
title | Comparing YOLOv3, YOLOv4 and YOLOv5 for Autonomous Landing Spot Detection in Faulty UAVs |
topic | object detection DOTA aerial image dataset deep learning YOLOv3 YOLOv4 YOLOv5 |
url | https://www.mdpi.com/1424-8220/22/2/464 |