DRONE-BASED CROP TYPE IDENTIFICATION WITH CONVOLUTIONAL NEURAL NETWORKS: AN EVALUATION OF THE PERFORMANCE OF RESNET ARCHITECTURES
This study investigates the application of deep learning techniques, specifically ResNet architectures, to automate crop type identification using remotely sensed data collected by a DJI Mavic Air drone. The imagery was captured at an altitude of 30 meters, maintaining an average airspeed of 5 m/s,...
Main Authors: | O. G. Ajayi, O. O. Olufade |
---|---|
Format: | Article |
Language: | English |
Published: | Copernicus Publications, 2023-12-01 |
Series: | ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences |
Online Access: | https://isprs-annals.copernicus.org/articles/X-1-W1-2023/991/2023/isprs-annals-X-1-W1-2023-991-2023.pdf |
_version_ | 1827594715781922816 |
---|---|
author | O. G. Ajayi; O. O. Olufade |
author_facet | O. G. Ajayi; O. O. Olufade |
author_sort | O. G. Ajayi |
collection | DOAJ |
description | This study investigates the application of deep learning techniques, specifically ResNet architectures, to automate crop type identification using remotely sensed data collected by a DJI Mavic Air drone. The imagery was captured at an altitude of 30 meters, maintaining an average airspeed of 5 m/s, and ensuring a front and side overlap of 75% and 65%, respectively. The pre-flight planning and image acquisition were facilitated through the Drone Deploy platform, yielding a dataset consisting of 1488 aerial photographs covering the study area. These images possess an average ground sampling distance (GSD) of 22.2 millimetres. The dataset was meticulously labelled with the "maize" class and employed to train three distinct ResNet architectures, namely ResNet-50, ResNet-101, and ResNet-152. The evaluation of these models was based on accuracy and processing time. Notably, ResNet-50 emerged as the most proficient, achieving an accuracy rate of 82% with a precision score of 0.5 after just two hours of initial training, while the ResNet-101 and ResNet-152 architectures achieved 27% and 24% accuracy, respectively. These outcomes underscore the potential of the ResNet-50 architecture, even with a limited dataset, as a valuable tool for precise crop-type classification within the precision agriculture domain. |
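The workflow summarized in the description amounts to fine-tuning pretrained ResNet backbones of three depths on the labelled drone image tiles and comparing their accuracy and training time. The sketch below is a minimal, hypothetical PyTorch/torchvision illustration of such a pipeline and is not the authors' code: the dataset path, class layout, input size, batch size, learning rate, and epoch count are all assumptions for illustration, and swapping `resnet50` for `resnet101` or `resnet152` yields the other two architectures compared in the study.

```python
# Hypothetical sketch, not the authors' implementation: fine-tune a pretrained
# ResNet-50 to classify drone image tiles as "maize" vs. other cover.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Standard ImageNet preprocessing; tiles are resized to 224x224 (assumed size).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Assumed ImageFolder layout: data/train/maize/*.jpg, data/train/other/*.jpg
train_set = datasets.ImageFolder("data/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)

# Load an ImageNet-pretrained ResNet-50 and replace the classifier head with
# one output per crop class; use resnet101/resnet152 for the deeper variants.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # assumed settings

model.train()
for epoch in range(10):  # epoch count is an assumption
    running_loss = 0.0
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        running_loss += loss.item()
    print(f"epoch {epoch + 1}: mean loss {running_loss / len(train_loader):.4f}")
```

Evaluating such a model on a held-out split (to obtain accuracy and precision figures like those reported above) would reuse the same loop with `model.eval()` and `torch.no_grad()` and a separate ImageFolder for the test tiles.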
first_indexed | 2024-03-09T02:40:24Z |
format | Article |
id | doaj.art-755384c88f254d3282f7980943b51008 |
institution | Directory Open Access Journal |
issn | 2194-9042; 2194-9050 |
language | English |
last_indexed | 2024-03-09T02:40:24Z |
publishDate | 2023-12-01 |
publisher | Copernicus Publications |
record_format | Article |
series | ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences |
spelling | doaj.art-755384c88f254d3282f7980943b51008; 2023-12-06T04:51:15Z; eng; Copernicus Publications; ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences; ISSN 2194-9042, 2194-9050; 2023-12-01; X-1-W1-2023, pp. 991-998; doi:10.5194/isprs-annals-X-1-W1-2023-991-2023; DRONE-BASED CROP TYPE IDENTIFICATION WITH CONVOLUTIONAL NEURAL NETWORKS: AN EVALUATION OF THE PERFORMANCE OF RESNET ARCHITECTURES; O. G. Ajayi (Department of Land and Spatial Sciences, Namibia University of Science and Technology, Windhoek, Namibia); O. O. Olufade (Department of Surveying and Geoinformatics, Federal University of Technology, Minna, Nigeria); abstract as in the description field above; https://isprs-annals.copernicus.org/articles/X-1-W1-2023/991/2023/isprs-annals-X-1-W1-2023-991-2023.pdf |
spellingShingle | O. G. Ajayi O. O. Olufade DRONE-BASED CROP TYPE IDENTIFICATION WITH CONVOLUTIONAL NEURAL NETWORKS: AN EVALUATION OF THE PERFORMANCE OF RESNET ARCHITECTURES ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences |
title | DRONE-BASED CROP TYPE IDENTIFICATION WITH CONVOLUTIONAL NEURAL NETWORKS: AN EVALUATION OF THE PERFORMANCE OF RESNET ARCHITECTURES |
title_full | DRONE-BASED CROP TYPE IDENTIFICATION WITH CONVOLUTIONAL NEURAL NETWORKS: AN EVALUATION OF THE PERFORMANCE OF RESNET ARCHITECTURES |
title_fullStr | DRONE-BASED CROP TYPE IDENTIFICATION WITH CONVOLUTIONAL NEURAL NETWORKS: AN EVALUATION OF THE PERFORMANCE OF RESNET ARCHITECTURES |
title_full_unstemmed | DRONE-BASED CROP TYPE IDENTIFICATION WITH CONVOLUTIONAL NEURAL NETWORKS: AN EVALUATION OF THE PERFORMANCE OF RESNET ARCHITECTURES |
title_short | DRONE-BASED CROP TYPE IDENTIFICATION WITH CONVOLUTIONAL NEURAL NETWORKS: AN EVALUATION OF THE PERFORMANCE OF RESNET ARCHITECTURES |
title_sort | drone based crop type identification with convolutional neural networks an evaluation of the performance of resnet architectures |
url | https://isprs-annals.copernicus.org/articles/X-1-W1-2023/991/2023/isprs-annals-X-1-W1-2023-991-2023.pdf |
work_keys_str_mv | AT ogajayi dronebasedcroptypeidentificationwithconvolutionalneuralnetworksanevaluationoftheperformanceofresnetarchitectures AT ooolufade dronebasedcroptypeidentificationwithconvolutionalneuralnetworksanevaluationoftheperformanceofresnetarchitectures |