Fully-automated root image analysis (faRIA)

Abstract High-throughput root phenotyping in soil has become an indispensable quantitative tool for assessing the effects of climatic factors and molecular perturbations on plant root morphology, development and function. To efficiently analyse large numbers of structurally complex soil-root images, advanced methods for automated image segmentation are required. Because the intensities of foreground and background regions often overlap unavoidably, simple thresholding methods are generally not suitable for segmenting root regions. Higher-level cognitive models such as convolutional neural networks (CNNs) can segment roots from heterogeneous and noisy background structures; however, they require a representative set of manually segmented (ground truth) images. Here, we present a GUI-based tool for fully automated quantitative analysis of root images using a pre-trained CNN model that relies on an extension of the U-Net architecture. The CNN framework was designed to efficiently segment root structures of different sizes, shapes and optical contrasts on low-budget hardware. The model was trained on a set of 6465 masks derived from 182 manually segmented near-infrared (NIR) maize root images. Our experimental results show that the proposed approach achieves a Dice coefficient of 0.87, outperforming existing tools such as SegRoot (Dice coefficient of 0.67), and that it applies not only to NIR but also to other imaging modalities and plant species, such as barley and Arabidopsis soil-root images from LED-rhizotron and UV imaging systems, respectively. In summary, the developed software framework enables users to analyse soil-root images efficiently and fully automatically (i.e. without manual interaction with data or parameter tuning), providing quantitative plant scientists with a powerful analytical tool.
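The abstract reports segmentation quality as Dice coefficients (0.87 for faRIA, 0.67 for SegRoot). As a point of reference only, the sketch below shows how a Dice coefficient can be computed for two binary segmentation masks; it is a minimal NumPy illustration, and the function and variable names are hypothetical, not part of the faRIA software.

```python
# Minimal sketch: Dice coefficient between two binary segmentation masks.
# Illustrative only -- not code from the faRIA tool itself.
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2 * |A intersect B| / (|A| + |B|) for masks of equal shape."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return (2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps)

# Example: compare a predicted root mask against a manually segmented ground truth.
pred = np.array([[0, 1, 1],
                 [0, 1, 0],
                 [0, 0, 0]])
truth = np.array([[0, 1, 1],
                  [0, 0, 0],
                  [0, 0, 0]])
print(round(dice_coefficient(pred, truth), 2))  # 0.8
```

A Dice coefficient of 1.0 indicates perfect overlap between predicted and ground-truth root pixels, so the reported 0.87 versus 0.67 quantifies the improvement over SegRoot under the same evaluation measure.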


Bibliographic Details
Main Authors: Narendra Narisetti, Michael Henke, Christiane Seiler, Astrid Junker, Jörn Ostermann, Thomas Altmann, Evgeny Gladilin
Author Affiliations: Leibniz Institute of Plant Genetics and Crop Plant Research (Narisetti, Henke, Seiler, Junker, Altmann, Gladilin); Institute for Information Processing (TNT), Leibniz University of Hannover (Ostermann)
Format: Article
Language: English
Published: Nature Portfolio, 2021-08-01
Series: Scientific Reports
ISSN: 2045-2322
Online Access: https://doi.org/10.1038/s41598-021-95480-y