Plant Leaf Position Estimation with Computer Vision

Autonomous analysis of plants, for example for phenotyping and health monitoring, often requires the reliable identification and localization of single leaves, a task complicated by their complex and variable shape. Robotic sensor platforms commonly use depth sensors that rely on either infrared light or ultrasound, in addition to imaging. However, infrared methods have the disadvantage of being affected by ambient light, and ultrasound methods generally have too wide a field of view, making them ineffective for measuring complex and intricate structures. Alternatives include stereoscopic or structured-light scanners, but these can be costly and overly complex to implement. This article presents a fully computer-vision-based solution capable of estimating the three-dimensional location of all leaves of a subject plant using a single digital camera autonomously positioned by a three-axis linear robot. A custom-trained neural network was used to classify leaves captured in multiple images of the subject plant. Parallax calculations were applied to predict leaf depth and, from this, the three-dimensional position. The article demonstrates a proof of concept of the method; initial tests with positioned leaves suggest an expected error of 20 mm. Future modifications are identified to further improve accuracy and utility across different plant canopies.
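
As a rough illustration only (the paper's actual formulation is not reproduced in this record), the sketch below shows how leaf depth and three-dimensional position might be recovered from the pixel shift of a detected leaf between two images taken as the single camera is translated a known distance by the linear robot. All function names, camera parameters, and numeric values are assumptions, not the authors' implementation.

# Illustrative sketch of parallax-based depth and 3D position estimation.
# Assumes a pinhole camera model; all parameter values are hypothetical.

def leaf_depth_from_parallax(pixel_shift_px, baseline_m, focal_length_px):
    """Depth from the standard parallax/disparity relation:
    depth = focal_length * baseline / disparity."""
    if pixel_shift_px <= 0:
        raise ValueError("Leaf must shift between views to measure parallax.")
    return focal_length_px * baseline_m / pixel_shift_px

def leaf_position_3d(u_px, v_px, depth_m, focal_length_px, cx_px, cy_px):
    """Back-project the leaf's image coordinates (u, v) to camera-frame
    XYZ coordinates using the pinhole model."""
    x = (u_px - cx_px) * depth_m / focal_length_px
    y = (v_px - cy_px) * depth_m / focal_length_px
    return x, y, depth_m

# Hypothetical example: a leaf centre detected at (820, 610) px shifts 45 px
# when the camera moves 0.05 m sideways; focal length 1400 px,
# principal point (960, 540) px.
depth = leaf_depth_from_parallax(pixel_shift_px=45, baseline_m=0.05,
                                 focal_length_px=1400)
print(leaf_position_3d(820, 610, depth, 1400, 960, 540))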

Bibliographic Details
Main Authors: James Beadle, C. James Taylor, Kirsti Ashworth, David Cheneler
Affiliations: Engineering Department, Lancaster University, Lancaster LA1 4YW, UK (J. Beadle, C. J. Taylor, D. Cheneler); Lancaster Environment Centre, Lancaster University, Lancaster LA1 4YW, UK (K. Ashworth)
Format: Article
Language: English
Published: MDPI AG, 2020-10-01
Series: Sensors, vol. 20, no. 20, article no. 5933
ISSN: 1424-8220
DOI: 10.3390/s20205933
Subjects: neural network; computer vision; depth estimation; position estimation; parallax
Collection: Directory of Open Access Journals (DOAJ)
Online Access: https://www.mdpi.com/1424-8220/20/20/5933