Broadacre Crop Yield Estimation Using Imaging Spectroscopy from Unmanned Aerial Systems (UAS): A Field-Based Case Study with Snap Bean
Accurate, precise, and timely estimation of crop yield is key to a grower’s ability to proactively manage crop growth and predict harvest logistics. Such yield predictions typically are based on multi-parametric models and in-situ sampling. Here we investigate the extension of a greenhouse study to...
Main Authors: | Amirhossein Hassanzadeh, Fei Zhang, Jan van Aardt, Sean P. Murphy, Sarah J. Pethybridge |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-08-01 |
Series: | Remote Sensing |
Subjects: | feature selection; hyperspectral imaging; machine learning; snap bean; unmanned aerial vehicle; yield modelling |
Online Access: | https://www.mdpi.com/2072-4292/13/16/3241 |
_version_ | 1797522172097331200 |
---|---|
author | Amirhossein Hassanzadeh; Fei Zhang; Jan van Aardt; Sean P. Murphy; Sarah J. Pethybridge
author_facet | Amirhossein Hassanzadeh; Fei Zhang; Jan van Aardt; Sean P. Murphy; Sarah J. Pethybridge
author_sort | Amirhossein Hassanzadeh |
collection | DOAJ |
description | Accurate, precise, and timely estimation of crop yield is key to a grower’s ability to proactively manage crop growth and predict harvest logistics. Such yield predictions typically are based on multi-parametric models and in-situ sampling. Here we investigate the extension of a greenhouse study to low-altitude unmanned aerial systems (UAS). Our principal objective was to investigate snap bean crop (*Phaseolus vulgaris*) yield using imaging spectroscopy (hyperspectral imaging) in the visible to near-infrared (VNIR; 400–1000 nm) region via UAS. We aimed to solve the problem of crop yield modelling by identifying spectral features that explain yield and by evaluating the best time period for accurate, early-season yield prediction. We introduced a Python library, named Jostar, for spectral feature selection. Embedded in Jostar, we proposed a new ranking method for selected features that reaches an agreement between multiple optimization models. Moreover, we implemented a well-known denoising algorithm for the spectral data used in this study. This study benefited from two years of remotely sensed data, captured at multiple instances over the summers of 2019 and 2020, with 24 and 18 plots, respectively. Two harvest stage models, early and late harvest, were assessed at two different locations in upstate New York, USA. Six varieties of snap bean were quantified using two components of yield, pod weight and seed length. We used two different vegetation detection algorithms, the Red-Edge Normalized Difference Vegetation Index (RENDVI) and the Spectral Angle Mapper (SAM), to subset the fields into vegetation vs. non-vegetation pixels. Partial least squares regression (PLSR) was used as the regression model. Among the nine optimization models embedded in Jostar, we selected the Genetic Algorithm (GA), Ant Colony Optimization (ACO), Simulated Annealing (SA), and Particle Swarm Optimization (PSO), along with their resulting joint ranking. The findings show that pod weight can be explained with a high coefficient of determination (R² = 0.78–0.93) and low root-mean-square error (RMSE = 940–1369 kg/ha) for two years of data. Seed length yield assessment resulted in higher accuracies (R² = 0.83–0.98) and lower errors (RMSE = 4.245–6.018 mm). Among the optimization models used, ACO and SA outperformed the others, and the SAM vegetation detection approach showed improved results compared to the RENDVI approach when dense canopies were examined. Wavelengths at 450, 500, 520, 650, 700, and 760 nm were identified in almost all data sets and harvest stage models used. The period between 44 and 55 days after planting (DAP) was identified as the optimal time period for yield assessment. Future work should involve transferring the learned concepts to a multispectral system for eventual operational use; further attention should also be paid to seed length as a ground truth data collection technique, since collecting this yield indicator is far more rapid and straightforward. |
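The description above names the main processing steps: vegetation masking with RENDVI or SAM, followed by PLSR on a subset of informative VNIR bands. Below is a minimal, self-contained Python sketch of those two steps. It is not the authors' Jostar pipeline; the band spacing, masking thresholds, fixed band subset, and synthetic plot data are illustrative assumptions, and scikit-learn's PLSRegression stands in for whatever PLSR implementation the study used.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
wavelengths = np.arange(400, 1001, 5)  # VNIR band centers, 400–1000 nm (assumed 5 nm spacing)

def rendvi_mask(cube, wl, threshold=0.4):
    """RENDVI = (R750 - R705) / (R750 + R705); keep pixels above an assumed threshold."""
    r750 = cube[..., np.argmin(np.abs(wl - 750))]
    r705 = cube[..., np.argmin(np.abs(wl - 705))]
    rendvi = (r750 - r705) / (r750 + r705 + 1e-9)
    return rendvi > threshold

def sam_mask(cube, reference, max_angle=0.15):
    """Spectral Angle Mapper: keep pixels whose angle (radians) to a vegetation reference is small."""
    flat = cube.reshape(-1, cube.shape[-1])
    cos = (flat @ reference) / (np.linalg.norm(flat, axis=1) * np.linalg.norm(reference) + 1e-9)
    angles = np.arccos(np.clip(cos, -1.0, 1.0))
    return (angles < max_angle).reshape(cube.shape[:-1])

# Tiny synthetic cube (rows, cols, bands) standing in for a UAS hyperspectral scene.
cube = rng.random((20, 20, wavelengths.size))
veg_reference = cube[0, 0]  # stand-in vegetation reference spectrum
print("RENDVI vegetation pixels:", int(rendvi_mask(cube, wavelengths).sum()))
print("SAM vegetation pixels:", int(sam_mask(cube, veg_reference).sum()))

# Plot-level PLSR yield model on a hypothetical subset of informative bands.
n_plots = 42  # 24 + 18 plots across the 2019 and 2020 seasons
X = rng.random((n_plots, wavelengths.size))  # synthetic mean reflectance per plot
y = 3000 + 4000 * X[:, np.argmin(np.abs(wavelengths - 700))] + rng.normal(0, 300, n_plots)

selected_nm = [450, 500, 520, 650, 700, 760]  # wavelengths the abstract reports as recurrent
sel = [np.argmin(np.abs(wavelengths - nm)) for nm in selected_nm]

X_tr, X_te, y_tr, y_te = train_test_split(X[:, sel], y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()
rmse = float(np.sqrt(mean_squared_error(y_te, pred)))
print(f"PLSR on synthetic data: R2 = {r2_score(y_te, pred):.2f}, RMSE = {rmse:.0f} kg/ha")
```

In the actual workflow, the vegetation mask would be applied before averaging pixel spectra per plot, and the band subset would come from Jostar's metaheuristic feature selection and joint ranking rather than being fixed by hand as it is here.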
first_indexed | 2024-03-10T08:25:42Z |
format | Article |
id | doaj.art-bd2e5fbe1daf4bb2b718ecd13aa2002b |
institution | Directory Open Access Journal |
issn | 2072-4292 |
language | English |
last_indexed | 2024-03-10T08:25:42Z |
publishDate | 2021-08-01 |
publisher | MDPI AG |
record_format | Article |
series | Remote Sensing |
spelling | doaj.art-bd2e5fbe1daf4bb2b718ecd13aa2002b | 2023-11-22T09:34:23Z | eng | MDPI AG | Remote Sensing | 2072-4292 | 2021-08-01 | 13(16):3241 | 10.3390/rs13163241 | Broadacre Crop Yield Estimation Using Imaging Spectroscopy from Unmanned Aerial Systems (UAS): A Field-Based Case Study with Snap Bean | Amirhossein Hassanzadeh, Fei Zhang, Jan van Aardt (Chester F. Carlson Center for Imaging Science, Rochester Institute of Technology, Rochester, NY 14623, USA); Sean P. Murphy, Sarah J. Pethybridge (Plant Pathology and Plant-Microbe Biology Section, School of Integrative Plant Science, Cornell AgriTech at The New York State Agricultural Experiment Station, Cornell University, Geneva, NY 14456, USA) | https://www.mdpi.com/2072-4292/13/16/3241 | feature selection; hyperspectral imaging; machine learning; snap bean; unmanned aerial vehicle; yield modelling |
spellingShingle | Amirhossein Hassanzadeh; Fei Zhang; Jan van Aardt; Sean P. Murphy; Sarah J. Pethybridge | Broadacre Crop Yield Estimation Using Imaging Spectroscopy from Unmanned Aerial Systems (UAS): A Field-Based Case Study with Snap Bean | Remote Sensing | feature selection; hyperspectral imaging; machine learning; snap bean; unmanned aerial vehicle; yield modelling |
title | Broadacre Crop Yield Estimation Using Imaging Spectroscopy from Unmanned Aerial Systems (UAS): A Field-Based Case Study with Snap Bean |
title_full | Broadacre Crop Yield Estimation Using Imaging Spectroscopy from Unmanned Aerial Systems (UAS): A Field-Based Case Study with Snap Bean |
title_fullStr | Broadacre Crop Yield Estimation Using Imaging Spectroscopy from Unmanned Aerial Systems (UAS): A Field-Based Case Study with Snap Bean |
title_full_unstemmed | Broadacre Crop Yield Estimation Using Imaging Spectroscopy from Unmanned Aerial Systems (UAS): A Field-Based Case Study with Snap Bean |
title_short | Broadacre Crop Yield Estimation Using Imaging Spectroscopy from Unmanned Aerial Systems (UAS): A Field-Based Case Study with Snap Bean |
title_sort | broadacre crop yield estimation using imaging spectroscopy from unmanned aerial systems uas a field based case study with snap bean |
topic | feature selection; hyperspectral imaging; machine learning; snap bean; unmanned aerial vehicle; yield modelling |
url | https://www.mdpi.com/2072-4292/13/16/3241 |
work_keys_str_mv | AT amirhosseinhassanzadeh broadacrecropyieldestimationusingimagingspectroscopyfromunmannedaerialsystemsuasafieldbasedcasestudywithsnapbean AT feizhang broadacrecropyieldestimationusingimagingspectroscopyfromunmannedaerialsystemsuasafieldbasedcasestudywithsnapbean AT janvanaardt broadacrecropyieldestimationusingimagingspectroscopyfromunmannedaerialsystemsuasafieldbasedcasestudywithsnapbean AT seanpmurphy broadacrecropyieldestimationusingimagingspectroscopyfromunmannedaerialsystemsuasafieldbasedcasestudywithsnapbean AT sarahjpethybridge broadacrecropyieldestimationusingimagingspectroscopyfromunmannedaerialsystemsuasafieldbasedcasestudywithsnapbean |