ForestTrav: 3D LiDAR-Only Forest Traversability Estimation for Autonomous Ground Vehicles

Bibliographic Details
Main Authors: Fabio A. Ruetz, Nicholas Lawrance, Emili Hernandez, Paulo V. K. Borges, Thierry Peynot
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Subjects: Autonomous ground vehicles, field robotics, mobile robots, LiDAR, robot learning, traversability estimation
Online Access: https://ieeexplore.ieee.org/document/10458917/
author Fabio A. Ruetz
Nicholas Lawrance
Emili Hernandez
Paulo V. K. Borges
Thierry Peynot
collection DOAJ
description Autonomous navigation in unstructured vegetated environments remains an open challenge. To successfully operate in these settings, autonomous ground vehicles (AGVs) must assess the environment and determine which vegetation is pliable enough to safely traverse. In this paper, we propose ForestTrav (Forest Traversability): a novel lidar-only (geometric), online traversability estimation (TE) method that can accurately generate a per-voxel traversability estimate for densely vegetated environments, demonstrated in dense subtropical forests. The method leverages a salient, probabilistic 3D voxel representation, continuously fusing incoming lidar measurements to maintain multiple, per-voxel ray statistics, in combination with the structural context and compactness of sparse convolutional neural networks (SCNNs) to perform accurate TE in densely vegetated environments. The proposed method is real-time capable and is shown to outperform state-of-the-art volumetric and 2.5D TE methods by a significant margin (0.62 vs. 0.41 Matthews correlation coefficient (MCC) score at 0.1 m voxel resolution) in challenging scenes and to generalize to unseen environments. ForestTrav demonstrates that lidar-only (geometric) methods can provide accurate, online TE in complex, densely vegetated environments. This capability has not previously been demonstrated in the literature in such complex environments. Further, we analyze the response of the TE methods to the temporal and spatial evolution of the probabilistic map as a function of information accumulated over time during scene exploration. This analysis shows that our method performs well even with limited information in the early stages of exploration, providing an additional tool to assess the expected performance during deployment. Finally, to train and assess TE methods in highly vegetated environments, we collected and labeled a novel, real-world data set and provide it to the community as an open-source resource.
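For readers unfamiliar with the metric quoted above, the reported comparison (0.62 vs. 0.41 MCC) uses the standard binary-classification form of the Matthews correlation coefficient. The formula below is general background rather than something reproduced from the paper, with TP, TN, FP and FN counted over voxels predicted traversable versus non-traversable:

\[
\mathrm{MCC} = \frac{TP \cdot TN - FP \cdot FN}{\sqrt{(TP + FP)(TP + FN)(TN + FP)(TN + FN)}}
\]

MCC ranges from -1 to 1, with 0 corresponding to chance-level prediction, which makes it a more informative summary than raw accuracy when the traversable and non-traversable classes are imbalanced.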
format Article
id doaj.art-0cbe0831042f46d18f312338d86e04b1
institution Directory Open Access Journal
issn 2169-3536
language English
publishDate 2024-01-01
publisher IEEE
record_format Article
series IEEE Access
doi 10.1109/ACCESS.2024.3373004
article_number 10458917
volume 12
pages 37192-37206
author_affiliation Fabio A. Ruetz (https://orcid.org/0000-0002-3277-3134), QUT Centre for Robotics, Queensland University of Technology (QUT), Brisbane, QLD, Australia
author_affiliation Nicholas Lawrance (https://orcid.org/0000-0003-2167-7427), Data61, CSIRO Robotics, Pullenvale, QLD, Australia
author_affiliation Emili Hernandez (https://orcid.org/0000-0002-6143-6161), Emesent, Milton, QLD, Australia
author_affiliation Paulo V. K. Borges (https://orcid.org/0000-0001-8137-7245), Data61, CSIRO Robotics, Pullenvale, QLD, Australia
author_affiliation Thierry Peynot (https://orcid.org/0000-0001-8275-6538), QUT Centre for Robotics, Queensland University of Technology (QUT), Brisbane, QLD, Australia
title ForestTrav: 3D LiDAR-Only Forest Traversability Estimation for Autonomous Ground Vehicles
topic Autonomous ground vehicles
field robotics
mobile robots
LiDAR
robot learning
traversability estimation
url https://ieeexplore.ieee.org/document/10458917/