Research on the Relative Position Detection Method between Orchard Robots and Fruit Tree Rows

The relative position of an orchard robot with respect to the rows of fruit trees is a key parameter for achieving autonomous navigation, and current methods for estimating these inter-row position parameters achieve low accuracy. To address this problem, this paper proposes a machine-vision-based method for detecting the relative position of orchard robots and fruit tree rows. First, fruit tree trunks are detected with an improved YOLOv4 model; second, the camera coordinates of each trunk are calculated using the principle of binocular triangulation, and the trunk's ground-projection coordinates are obtained through coordinate conversion; finally, the midpoints of the projected trunk coordinates on opposite sides are combined, the navigation path is obtained by least-squares linear fitting, and the robot's position parameters are computed from the fitted path. Experimental results show that the average accuracy and average recall of the improved YOLOv4 model for trunk detection are 5.92% and 7.91% higher, respectively, than those of the original YOLOv4 model. The average errors of the heading angle and lateral deviation estimates obtained with the proposed method are 0.57° and 0.02 m, respectively. The method can accurately calculate heading angle and lateral deviation at different positions between rows and provides a reference for the autonomous visual navigation of orchard robots.
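To make the geometry of the abstract's pipeline concrete, below is a minimal Python sketch, not the authors' implementation: stereo triangulation of a detected trunk, projection onto the ground plane, least-squares fitting of the navigation path, and computation of heading angle and lateral deviation. All function names are illustrative, and the sketch assumes a level, forward-facing camera and a naive forward-order pairing of trunks across the two rows.

```python
# Minimal sketch (not the paper's code) of the position-estimation pipeline.
import numpy as np

def camera_coords(u, v, disparity, f, baseline, cx, cy):
    """Back-project a trunk's pixel (u, v) to camera coordinates using
    standard stereo triangulation: Z = f * B / d."""
    z = f * baseline / disparity
    return np.array([(u - cx) * z / f, (v - cy) * z / f, z])

def ground_projection(p_cam):
    """Project a camera-frame point onto the ground plane. Assuming a
    level, forward-facing camera, this simply drops the height axis,
    leaving (lateral x, forward z)."""
    return np.array([p_cam[0], p_cam[2]])

def fit_path(left_xy, right_xy):
    """Pair trunks on opposite sides (here naively by forward order),
    take midpoints, and least-squares-fit the line x = a*z + b.
    Fitting x as a function of forward distance z avoids the
    near-vertical slope degeneracy of the usual y = m*x + c form."""
    left = left_xy[np.argsort(left_xy[:, 1])]
    right = right_xy[np.argsort(right_xy[:, 1])]
    n = min(len(left), len(right))
    mid = (left[:n] + right[:n]) / 2.0
    a, b = np.polyfit(mid[:, 1], mid[:, 0], 1)
    return a, b

def position_parameters(a, b):
    """Heading angle (deg) between the robot's forward axis and the path,
    and lateral deviation (m) as the perpendicular distance from the
    camera origin (0, 0) to the line x = a*z + b."""
    heading = np.degrees(np.arctan(a))
    lateral = abs(b) / np.hypot(1.0, a)
    return heading, lateral

# Toy example, skipping detection/triangulation: ground projections of
# trunks ~1.5 m to each side, with the robot offset 0.1 m from center.
left = np.array([[-1.6, 2.0], [-1.6, 4.0], [-1.6, 6.0]])
right = np.array([[1.4, 2.0], [1.4, 4.0], [1.4, 6.0]])
a, b = fit_path(left, right)
print(position_parameters(a, b))  # ~ (0.0 deg, 0.1 m)
```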

Bibliographic Details
Main Authors: Baoxing Gu, Qin Liu, Yi Gao, Guangzhao Tian, Baohua Zhang, Haiqing Wang, He Li
Format: Article
Language: English
Published: MDPI AG, 2023-10-01
Series: Sensors, vol. 23, no. 21, article 8807
ISSN: 1424-8220
DOI: 10.3390/s23218807
Subjects: orchard robot; autonomous navigation; positional parameters; machine vision; YOLO
Online Access: https://www.mdpi.com/1424-8220/23/21/8807
Author Affiliations:
Baoxing Gu, Yi Gao, Guangzhao Tian, Haiqing Wang, He Li: College of Engineering, Nanjing Agricultural University, Nanjing 210031, China
Qin Liu: School of Cyber Science and Engineering, Southeast University, Nanjing 210096, China
Baohua Zhang: College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China