Automatic Stitching for Hyperspectral Images Using Robust Feature Matching and Elastic Warp

Bibliographic Details
Main Authors: Yujie Zhang, Zhiying Wan, Xingyu Jiang, Xiaoguang Mei
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Online Access: https://ieeexplore.ieee.org/document/9112681/
Description
Summary: Hyperspectral images, which contain not only spatial information but also rich spectral information, have been widely applied in fields such as agriculture and urban planning. However, a single image can rarely cover a large area, so photos of different parts of the scene must be taken and combined by image stitching to obtain a panoramic hyperspectral image. When the viewpoint of the scene changes substantially, traditional methods suffer from ghosting artifacts. To obtain high-precision panoramas, this article proposes an automatic image stitching algorithm for hyperspectral images using robust feature matching and elastic warp. The method has two stages. The first stage chooses one band as the reference band and obtains the panorama in that single band: feature points are extracted with the scale-invariant feature transform (SIFT); an efficient algorithm, the multiscale top-K rank preservation algorithm, establishes robust point correspondences between the two point sets; and a robust elastic warp then produces the panorama of the band. The second stage stitches all remaining bands using the transformation obtained in the first stage and fuses the information of all bands to produce the final panoramic hyperspectral image. Extensive experiments demonstrate the effectiveness of the proposed method.
ISSN:2151-1535
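The two-stage idea in the summary can be sketched in a few lines: estimate a geometric transform once on the reference band, then reuse that same transform for every remaining band before fusing the cube. The sketch below is a minimal numpy illustration of stage two only, assuming a plain 3x3 homography in place of the paper's elastic warp and MTK-RP matching; the names `warp_band` and `stitch_hyperspectral` are hypothetical, not from the article.

```python
import numpy as np

def warp_band(band, H, out_shape):
    """Warp a single 2-D band with homography H (inverse mapping,
    nearest-neighbor sampling; out-of-bounds pixels are left at zero)."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Homogeneous coordinates of every output pixel, shape (3, N).
    coords = np.stack([xs, ys, np.ones_like(xs)], axis=-1).reshape(-1, 3).T
    src = np.linalg.inv(H) @ coords          # map output pixels back to source
    src = src[:2] / src[2]                   # de-homogenize
    sx = np.round(src[0]).astype(int)
    sy = np.round(src[1]).astype(int)
    valid = (sx >= 0) & (sx < band.shape[1]) & (sy >= 0) & (sy < band.shape[0])
    out = np.zeros(h * w, dtype=band.dtype)
    out[valid] = band[sy[valid], sx[valid]]
    return out.reshape(h, w)

def stitch_hyperspectral(cube, H, out_shape):
    """Stage two of the pipeline: apply the transform estimated on the
    reference band to every band of the (H, W, bands) cube."""
    return np.stack([warp_band(cube[..., b], H, out_shape)
                     for b in range(cube.shape[-1])], axis=-1)
```

Reusing one transform across all bands is what keeps the result spectrally consistent: since the bands are co-registered by the sensor, a per-band re-estimation would only add noise and risk misalignment between bands.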