Development of Multiple UAV Collaborative Driving Systems for Improving Field Phenotyping
Unmanned aerial vehicle (UAV)-based remote sensing has recently been widely applied to crop monitoring owing to the rapid development of UAVs, and these technologies have considerable potential in smart agriculture. Field phenotyping with remote sensing is mostly performed using UAVs equipped with RGB or multispectral cameras.
Main Authors: | Hyeon-Seung Lee, Beom-Soo Shin, J. Alex Thomasson, Tianyi Wang, Zhao Zhang, Xiongzhe Han |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-02-01 |
Series: | Sensors |
Subjects: | multiple UAVs; remote sensing; collaborative driving; field phenotyping; synchronized motion |
Online Access: | https://www.mdpi.com/1424-8220/22/4/1423 |
author | Hyeon-Seung Lee Beom-Soo Shin J. Alex Thomasson Tianyi Wang Zhao Zhang Xiongzhe Han |
collection | DOAJ |
description | Unmanned aerial vehicle (UAV)-based remote sensing has recently been widely applied to crop monitoring owing to the rapid development of UAVs, and these technologies have considerable potential in smart agriculture. Field phenotyping with remote sensing is mostly performed using UAVs equipped with RGB or multispectral cameras. Accurate field phenotyping for precision agriculture requires images taken simultaneously from multiple perspectives, and phenotypic measurement errors can arise from the movement of the drone and of the plants during flight. In this study, to minimize measurement error and improve the digital surface model, we propose a collaborative driving system that allows multiple UAVs to acquire images simultaneously from different viewpoints. An integrated navigation system based on MAVSDK is configured for the attitude and position control of the UAVs. Using a leader–follower swarm driving algorithm and a long-range wireless network, the follower drone cooperates with the leader drone to maintain a constant speed, direction, and image overlap ratio, and to hold its position in formation to improve phenotyping. A collision avoidance algorithm was developed because UAVs flying in formation can collide under external disturbances such as wind. A GAZEBO-based simulation environment was established to verify and optimize the flight algorithm in a virtual environment. Using the algorithm verified and optimized in simulation, the UAVs were flown along the same flight path in a real field, and the simulation and field results were compared. In this comparison, the simulated flight accuracy (RMSE) was 0.36 m and the field flight accuracy was 0.46 m, comparable to that of a commercial program. (Illustrative sketches of the follower control loop and the accuracy metric follow this record.) |
format | Article |
id | doaj.art-6d25d7124fab4da8a40a057669db5184 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
publishDate | 2022-02-01 |
publisher | MDPI AG |
series | Sensors |
doi | 10.3390/s22041423
citation | Sensors 2022, 22(4), 1423
affiliations | Hyeon-Seung Lee: Department of Biosystems Engineering, College of Agriculture and Life Sciences, Kangwon National University, Chuncheon 24341, Korea; Beom-Soo Shin: Department of Biosystems Engineering, College of Agriculture and Life Sciences, Kangwon National University, Chuncheon 24341, Korea; J. Alex Thomasson: Department of Agricultural and Biological Engineering, Mississippi State University, Starkville, MS 39762, USA; Tianyi Wang: College of Engineering, China Agricultural University, Beijing 100083, China; Zhao Zhang: Key Laboratory of Smart Agriculture System Integration, Ministry of Education, China Agricultural University, Beijing 100083, China; Xiongzhe Han: Department of Biosystems Engineering, College of Agriculture and Life Sciences, Kangwon National University, Chuncheon 24341, Korea
title | Development of Multiple UAV Collaborative Driving Systems for Improving Field Phenotyping |
topic | multiple UAVs; remote sensing; collaborative driving; field phenotyping; synchronized motion |
url | https://www.mdpi.com/1424-8220/22/4/1423 |
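The abstract describes a leader–follower collaborative flight scheme in which the follower UAV tracks the leader through a MAVSDK-based navigation layer. The sketch below is a minimal, hypothetical illustration of that idea using the MAVSDK-Python offboard API; the UDP connection address, the fixed 5 m east offset, and the `get_leader_position_ned()` placeholder are assumptions for illustration, not details taken from the paper.

```python
# Minimal leader-follower offboard sketch using MAVSDK-Python (pip install mavsdk).
# Assumptions (not from the paper): the follower autopilot listens on udp://:14541,
# the leader's NED position arrives through a user-supplied coroutine, and the
# follower holds a fixed 5 m offset east of the leader.

import asyncio

from mavsdk import System
from mavsdk.offboard import OffboardError, PositionNedYaw

EAST_OFFSET_M = 5.0  # hypothetical lateral spacing between leader and follower
ALTITUDE_M = 20.0    # hypothetical flight altitude (NED "down" is negative up)


async def get_leader_position_ned():
    """Placeholder: in a real system this would read the leader's NED position
    over the long-range wireless link described in the paper."""
    return 0.0, 0.0  # north_m, east_m


async def run_follower():
    follower = System()
    await follower.connect(system_address="udp://:14541")

    # Wait until the autopilot connection is established.
    async for state in follower.core.connection_state():
        if state.is_connected:
            break

    await follower.action.arm()

    # Offboard mode requires an initial setpoint before it can be started.
    await follower.offboard.set_position_ned(
        PositionNedYaw(0.0, EAST_OFFSET_M, -ALTITUDE_M, 0.0))
    try:
        await follower.offboard.start()
    except OffboardError as err:
        print(f"Offboard start failed: {err._result.result}")
        await follower.action.disarm()
        return

    # Track the leader with a constant east offset, streaming setpoints at 10 Hz.
    while True:
        north, east = await get_leader_position_ned()
        await follower.offboard.set_position_ned(
            PositionNedYaw(north, east + EAST_OFFSET_M, -ALTITUDE_M, 0.0))
        await asyncio.sleep(0.1)


if __name__ == "__main__":
    asyncio.run(run_follower())
```

In a PX4 SITL/GAZEBO setup like the simulation environment the abstract mentions, a script of this form would simply be pointed at the simulated follower's UDP port.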
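The abstract also reports flight accuracy as an RMSE (0.36 m in simulation, 0.46 m in the field). Assuming the metric is the root-mean-square of the horizontal distance between the planned and the actually flown positions, RMSE = sqrt((1/n) * sum_i d_i^2), a small sketch of the computation follows; the sample arrays are placeholders, not data from the study.

```python
# Illustrative path-tracking RMSE: distance between planned and flown positions.
import numpy as np


def path_rmse(planned_xy: np.ndarray, flown_xy: np.ndarray) -> float:
    """Root-mean-square position error, in the same units as the inputs."""
    distances = np.linalg.norm(planned_xy - flown_xy, axis=1)  # per-sample error
    return float(np.sqrt(np.mean(distances ** 2)))


# Placeholder waypoints and logged positions (metres), for illustration only.
planned = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 0.0]])
flown = np.array([[0.2, -0.1], [10.3, 0.4], [19.7, 0.2]])
print(f"RMSE = {path_rmse(planned, flown):.2f} m")
```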