TrmGLU-Net: transformer-augmented global-local U-Net for hyperspectral image classification with limited training samples

Bibliographic Details
Main Authors: Bing Liu, Yifan Sun, Ruirui Wang, Anzhu Yu, Zhixiang Xue, Yusong Wang
Format: Article
Language: English
Published: Taylor & Francis Group 2023-12-01
Series:European Journal of Remote Sensing
Subjects:
Online Access: https://www.tandfonline.com/doi/10.1080/22797254.2023.2227993
Description
Summary: In recent years, deep learning methods have been widely used for the classification of hyperspectral images. However, the limited availability of training samples remains a serious issue for these methods. Moreover, the current mainstream approaches based on convolutional neural networks do well in local feature extraction but are restricted by their limited receptive fields. Hence, these models are unable to capture long-distance dependencies in both the spatial and spectral dimensions. To address the above issues, this paper proposes a global-local U-Net augmented by transformers (TrmGLU-Net). First, whole hyperspectral images are input to the model for end-to-end training to capture contextual information. Then, a transformer-augmented U-Net is designed with alternating transformer and convolutional layers to perceive both global and local information. Finally, a superpixel-based label expansion method is proposed to expand the labels and improve performance under the condition of small samples. Extensive experiments on four hyperspectral scenes demonstrate that TrmGLU-Net outperforms other advanced patch-level and image-level methods with limited training samples. The relevant code will be released at https://github.com/sssssyf/TrmGLU-Net
ISSN:2279-7254
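
The central design described in the summary is the alternation of convolutional layers (local feature extraction) with transformer layers (global, long-distance dependencies) inside a U-Net. The sketch below is a minimal, hypothetical illustration of one such alternating block in PyTorch; the class name GlobalLocalBlock, the layer sizes, and the toy input are assumptions for illustration only and do not reproduce the authors' released implementation (see the GitHub link in the summary).

```python
# Minimal sketch (not the authors' code) of a block that alternates a
# convolution (local features) with a transformer encoder layer (global
# dependencies), as described in the abstract. All sizes are assumptions.
import torch
import torch.nn as nn


class GlobalLocalBlock(nn.Module):
    """Convolution for local features followed by a transformer
    encoder layer over flattened spatial tokens for global context."""

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.transformer = nn.TransformerEncoderLayer(
            d_model=channels, nhead=num_heads, batch_first=True
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Local feature extraction on the (B, C, H, W) feature map.
        x = self.conv(x)
        b, c, h, w = x.shape
        # Flatten spatial positions into tokens: (B, H*W, C).
        tokens = x.flatten(2).transpose(1, 2)
        # Self-attention over all positions captures long-range context.
        tokens = self.transformer(tokens)
        # Restore the spatial layout for the next (U-Net) stage.
        return tokens.transpose(1, 2).reshape(b, c, h, w)


if __name__ == "__main__":
    # Toy feature map: batch of 1, 64 channels, 32x32 spatial extent.
    block = GlobalLocalBlock(channels=64)
    out = block(torch.randn(1, 64, 32, 32))
    print(out.shape)  # torch.Size([1, 64, 32, 32])
```

In a full U-Net such blocks would sit at the encoder and decoder stages, so the model sees the whole hyperspectral image end to end rather than isolated patches, which is the image-level setting the summary contrasts with patch-level methods.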