Hierarchical local global transformer for point cloud analysis


Bibliographic Details
Main Authors: Dilong Li, Shenghong Zheng, Ziyi Chen, Xiang Li, Lanying Wang, Jixiang Du
Format: Article
Language: English
Published: Elsevier 2024-05-01
Series: International Journal of Applied Earth Observation and Geoinformation
Online Access: http://www.sciencedirect.com/science/article/pii/S1569843224001675
Description
Summary: Transformer networks have demonstrated remarkable performance in point cloud analysis. However, balancing local regional context with global long-range context learning remains a significant challenge. In this paper, we propose a Hierarchical Local Global Transformer Network (LGTNet), designed to capture local and global contexts in a hierarchical manner. Specifically, we employ serial local and global Transformers to learn inner-group and cross-group self-attention, respectively. Besides, we propose a geometric moment-based position encoding for the local Transformer, enabling the embedding of comprehensive local geometric relationships. Additionally, we introduce a global feature pooling module that extracts global features from each encoder layer. Extensive experimental results demonstrate that LGTNet achieves state-of-the-art performance on the ShapeNetPart and ScanObjectNN datasets. This approach effectively enhances the understanding of point cloud scenes, thereby facilitating the use of point cloud data in remote sensing applications.
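The serial local-global attention scheme summarized above can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the index-based grouping, the specific moment features used as position encoding, the uniform projection matrix, and the max-pooled group summaries are all simplifications chosen for clarity.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    """Single-head scaled dot-product self-attention, no learned projections."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    return softmax(scores) @ x

def moment_encoding(pts):
    """Toy geometric-moment position encoding for one group of points:
    offset from the group centroid (first-order moment) plus the six
    unique second-order moment terms of that offset."""
    offset = pts - pts.mean(axis=0)                                   # (k, 3)
    second = np.stack([offset[:, i] * offset[:, j]
                       for i in range(3) for j in range(i, 3)], axis=1)  # (k, 6)
    return np.concatenate([offset, second], axis=1)                   # (k, 9)

def local_global_block(points, feats, num_groups=4):
    """Serial local-global stage: inner-group attention with moment-based
    position encoding, then cross-group attention over group summaries."""
    n, d = feats.shape
    # assumption: groups formed by simple index splitting, not FPS/kNN
    groups = np.array_split(np.arange(n), num_groups)
    out = np.empty_like(feats)
    for idx in groups:                          # local: inner-group self-attention
        pe = moment_encoding(points[idx])       # (k, 9)
        pe_proj = pe @ np.ones((9, d)) / 9.0    # toy projection to feature dim
        out[idx] = self_attention(feats[idx] + pe_proj)
    # global: cross-group self-attention over pooled group summaries
    summaries = np.stack([out[idx].max(axis=0) for idx in groups])    # (g, d)
    ctx = self_attention(summaries)
    for g, idx in enumerate(groups):            # broadcast global context back
        out[idx] = out[idx] + ctx[g]
    return out

rng = np.random.default_rng(0)
pts = rng.normal(size=(32, 3))       # xyz coordinates
feats = rng.normal(size=(32, 16))    # per-point features
y = local_global_block(pts, feats)
```

The serial ordering matters: the global stage attends over summaries that already carry locally aggregated context, which is what lets cross-group attention propagate long-range information cheaply (g × g attention instead of n × n).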
ISSN:1569-8432