Terrain-Net: A Highly-Efficient, Parameter-Free, and Easy-to-Use Deep Neural Network for Ground Filtering of UAV LiDAR Data in Forested Environments


Bibliographic Details
Main Authors: Bowen Li, Hao Lu, Han Wang, Jianbo Qi, Gang Yang, Yong Pang, Haolin Dong, Yining Lian
Format: Article
Language: English
Published: MDPI AG, 2022-11-01
Series: Remote Sensing
Subjects: UAV; LiDAR; ground filtering; deep learning; forestry
Online Access: https://www.mdpi.com/2072-4292/14/22/5798
author Bowen Li
Hao Lu
Han Wang
Jianbo Qi
Gang Yang
Yong Pang
Haolin Dong
Yining Lian
collection DOAJ
description In recent years, interest has grown in using Unmanned Aerial Vehicles (UAVs) equipped with LiDAR (Light Detection and Ranging) to capture the 3D structure of forests for forestry and ecosystem monitoring applications. Because the terrain is an essential basis for modeling the vertical structure of a forest, point cloud filtering that delivers a highly accurate Digital Terrain Model (DTM) contributes significantly to forest studies. Conventional point cloud filtering algorithms require users to select suitable parameters based on knowledge of the algorithm and the characteristics of the scanned scene, a process that is typically empirical and time-consuming. Deep learning offers a novel approach to classifying and segmenting LiDAR point clouds, yet few studies have applied it to filtering non-ground LiDAR points in forested environments. In this study, we proposed an end-to-end, highly efficient network named Terrain-net, which combines a 3D point convolution operator with a self-attention mechanism to capture local and global features for ground filtering of UAV point clouds. The network was trained on over 15 million labeled points from 70 forest sites and evaluated on 17 sites covering various forested environments. Terrain-net was compared with four classical filtering algorithms and one of the most widely recognized point convolution-based deep learning methods (KP-FCNN). Results indicated that Terrain-net achieved the best performance in terms of the Kappa coefficient (0.93), MIoU (0.933), and overall accuracy (98.0%). Terrain-net also transferred well to an additional third-party open dataset for ground filtering in large-scale scenes and other vegetated environments, with no parameter tuning required. We hope Terrain-net will be widely applied as a new, highly efficient, parameter-free, and easy-to-use tool for ground filtering of LiDAR data in varying forest environments.
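The evaluation metrics cited in the description (Kappa coefficient, MIoU, and overall accuracy) are standard measures for binary ground/non-ground point classification. The snippet below is a minimal illustrative sketch, not code from the paper, showing how these three metrics can be computed from reference and predicted point labels with NumPy; the function name and the 0 = non-ground / 1 = ground label convention are assumptions made here for illustration.

```python
import numpy as np

def ground_filtering_metrics(y_true, y_pred):
    """Overall accuracy, mean IoU, and Cohen's Kappa for binary
    ground (1) / non-ground (0) point labels.

    Illustrative sketch only; not the Terrain-net evaluation code.
    """
    y_true = np.asarray(y_true, dtype=np.int64)
    y_pred = np.asarray(y_pred, dtype=np.int64)

    # 2x2 confusion matrix: rows = reference class, columns = predicted class
    cm = np.bincount(2 * y_true + y_pred, minlength=4).reshape(2, 2)
    n = cm.sum()

    # Overall accuracy: fraction of points labeled correctly
    oa = np.trace(cm) / n

    # Per-class IoU = TP / (TP + FP + FN), averaged over the two classes
    ious = []
    for c in (0, 1):
        tp = cm[c, c]
        fp = cm[:, c].sum() - tp
        fn = cm[c, :].sum() - tp
        ious.append(tp / (tp + fp + fn))
    miou = float(np.mean(ious))

    # Cohen's Kappa: observed agreement corrected for chance agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / (n * n)
    kappa = (oa - pe) / (1 - pe)

    return oa, miou, kappa
```

Given arrays of per-point labels, `oa, miou, kappa = ground_filtering_metrics(ref_labels, pred_labels)` returns the three quantities in the same form as those reported above.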
format Article
id doaj.art-ef0778a27dcf48adbf4676b290597c72
institution Directory Open Access Journal
issn 2072-4292
language English
publishDate 2022-11-01
publisher MDPI AG
series Remote Sensing
doi 10.3390/rs14225798
citation Remote Sensing, Vol. 14, No. 22, Article 5798, published 2022-11-01 by MDPI AG
affiliation Bowen Li, Hao Lu, Han Wang, Gang Yang, Haolin Dong, Yining Lian: School of Information Science and Technology, Beijing Forestry University, Beijing 100083, China
affiliation Jianbo Qi: Research Center of Forest Management Engineering of State Forestry and Grassland Administration, Beijing Forestry University, Beijing 100083, China
affiliation Yong Pang: Institute of Forest Resource Information Techniques, Chinese Academy of Forestry, Beijing 100091, China
title Terrain-Net: A Highly-Efficient, Parameter-Free, and Easy-to-Use Deep Neural Network for Ground Filtering of UAV LiDAR Data in Forested Environments
topic UAV
LiDAR
ground filtering
deep learning
forestry
url https://www.mdpi.com/2072-4292/14/22/5798