Localization in Unstructured Environments: Towards Autonomous Robots in Forests with Delaunay Triangulation

Bibliographic Details
Main Authors: Qingqing Li, Paavo Nevalainen, Jorge Peña Queralta, Jukka Heikkonen, Tomi Westerlund
Format: Article
Language: English
Published: MDPI AG, 2020-06-01
Series: Remote Sensing
Online Access: https://www.mdpi.com/2072-4292/12/11/1870
Description
Summary: Autonomous harvesting and transportation is a long-term goal of the forest industry. One of the main challenges is the accurate localization of both vehicles and trees in a forest. Forests are unstructured environments where it is difficult to find a group of significant landmarks for current fast feature-based place-recognition algorithms. This paper proposes a novel approach in which local point clouds are matched to a global tree map using the Delaunay triangulation as the representation format. Instead of point-cloud-based matching methods, we utilize a topology-based method. First, tree trunk positions are registered during a prior run done by a forest harvester. Second, the resulting map is Delaunay triangulated. Third, a local submap of the autonomous robot is registered, triangulated, and matched to the global map using triangular similarity maximization to estimate the position of the robot. We test our method on a dataset collected at a forestry site in Lieksa, Finland. A total of 200 m of harvester path was recorded by an industrial harvester with a 3D laser scanner and a geolocation unit fixed to its frame. Our experiments show a localization accuracy with a standard deviation of 12 cm and real-time data processing at speeds not exceeding 0.5 m/s. Both the accuracy and the speed limit are realistic for forest operations.
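
The three-step pipeline in the summary (triangulate a global trunk map, triangulate a local submap, match triangles by similarity) can be illustrated with a minimal Python sketch using SciPy's Delaunay triangulation. The triangle descriptor chosen here (side lengths normalized by the longest side) and the brute-force nearest-descriptor search are illustrative assumptions for exposition, not the authors' implementation.

    # Minimal sketch of topology-based matching over Delaunay triangulations.
    # Assumes 2D tree-trunk positions; descriptor and search are illustrative.
    import numpy as np
    from scipy.spatial import Delaunay

    def triangle_descriptors(points: np.ndarray, tri: Delaunay) -> np.ndarray:
        """Scale-invariant descriptor per triangle: side lengths sorted and
        normalized by the longest side, giving (a/c, b/c, 1)."""
        descs = []
        for simplex in tri.simplices:
            p = points[simplex]
            sides = np.sort([
                np.linalg.norm(p[0] - p[1]),
                np.linalg.norm(p[1] - p[2]),
                np.linalg.norm(p[2] - p[0]),
            ])
            descs.append(sides / sides[-1])
        return np.array(descs)

    def match_triangles(global_pts: np.ndarray, local_pts: np.ndarray):
        """Match each local triangle to the most similar global triangle
        (brute-force similarity maximization over descriptors)."""
        g_tri, l_tri = Delaunay(global_pts), Delaunay(local_pts)
        g_desc = triangle_descriptors(global_pts, g_tri)
        l_desc = triangle_descriptors(local_pts, l_tri)
        matches = []
        for i, d in enumerate(l_desc):
            j = int(np.argmin(np.linalg.norm(g_desc - d, axis=1)))
            matches.append((l_tri.simplices[i], g_tri.simplices[j]))
        return matches

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        trees = rng.uniform(0, 50, size=(40, 2))            # global trunk map (m)
        submap = trees[:12] + rng.normal(0, 0.05, (12, 2))  # noisy local view
        pairs = match_triangles(trees, submap)
        print(f"matched {len(pairs)} local triangles to the global map")

In a full system, the matched vertex correspondences would feed a rigid-transform estimate of the robot pose; that step is omitted here for brevity.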
ISSN: 2072-4292