Fast and Structured Block-Term Tensor Decomposition for Hyperspectral Unmixing

The block-term tensor decomposition model with multilinear rank-$(L_r, L_r, 1)$ terms (the "LL1 tensor decomposition" for short) offers a valuable alternative formulation for hyperspectral unmixing (HU): it ensures identifiability of the endmembers and abundances in cases where classic matrix factorization (MF) approaches cannot provide such guarantees. However, existing LL1-tensor-decomposition-based HU algorithms use a three-factor parameterization of the tensor (i.e., the hyperspectral image cube), which makes it difficult to incorporate the structural prior information that arises in HU; consequently, these algorithms often exhibit high per-iteration complexity and slow convergence. This article focuses on LL1 tensor decomposition under structural constraints and regularization terms in HU. The proposed algorithm uses a two-factor reparameterization of the tensor model in which, as in MF-based approaches, the factors correspond to the endmembers and abundances, so the framework naturally accommodates physics-motivated priors. To tackle the formulated optimization problem, a two-block alternating gradient projection (GP) algorithm is proposed, with carefully designed projection solvers that keep the per-iteration complexity relatively low and an extrapolation-based acceleration strategy that expedites convergence. Such extrapolated multiblock algorithms previously had only asymptotic convergence assurances in the literature; our analysis shows that, under reasonable conditions, the algorithm converges to the vicinity of a stationary point within finitely many iterations. Empirical studies show that the proposed algorithm often attains orders-of-magnitude speedups and substantial HU performance gains compared with existing LL1-decomposition-based HU algorithms.
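For reference, the model the abstract refers to can be written out explicitly. The following is a sketch in standard LL1 notation; the symbols ($\underline{\mathbf{Y}}$, $\mathbf{A}_r$, $\mathbf{B}_r$, $\mathbf{c}_r$, $\mathbf{S}_r$) are illustrative and may differ from the paper's:

\[
\underline{\mathbf{Y}} \;=\; \sum_{r=1}^{R} \left(\mathbf{A}_r \mathbf{B}_r^{\top}\right) \circ \mathbf{c}_r,
\qquad \mathbf{A}_r \in \mathbb{R}^{I \times L_r},\ \ \mathbf{B}_r \in \mathbb{R}^{J \times L_r},\ \ \mathbf{c}_r \in \mathbb{R}^{K},
\]

where $\underline{\mathbf{Y}} \in \mathbb{R}^{I \times J \times K}$ is the hyperspectral cube ($I \times J$ spatial, $K$ spectral) and $\circ$ denotes the outer product. Existing LL1-based HU methods optimize over the three factor sets $\{\mathbf{A}_r\}$, $\{\mathbf{B}_r\}$, $\{\mathbf{c}_r\}$. The two-factor reparameterization described in the abstract instead groups each rank-$L_r$ product into a single spatial map,

\[
\underline{\mathbf{Y}} \;=\; \sum_{r=1}^{R} \mathbf{S}_r \circ \mathbf{c}_r,
\qquad \mathbf{S}_r = \mathbf{A}_r \mathbf{B}_r^{\top},\ \ \operatorname{rank}(\mathbf{S}_r) \le L_r,
\]

so that, as in MF-based unmixing, $\mathbf{S}_r$ plays the role of the abundance map of the $r$-th material and $\mathbf{c}_r$ its endmember spectrum, and physics-motivated constraints (e.g., nonnegativity or sum-to-one on the abundances) can be imposed on $\{\mathbf{S}_r\}$ directly.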

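The abstract also outlines a two-block alternating gradient projection (GP) scheme with extrapolation-based acceleration. Below is a minimal, self-contained sketch of that generic extrapolate/step/project pattern, not the paper's implementation: the plain least-squares loss, the per-pixel simplex projection for the abundances, the nonnegativity projection for the endmembers, the fixed step sizes, the fixed extrapolation weight, and all names (proj_simplex, ll1_hu_gp, step_S, etc.) are assumptions made for illustration, and the rank-$L_r$ structure on each abundance map and the paper's regularizers are omitted.

import numpy as np

def proj_simplex(v, axis=-1):
    """Euclidean projection of each slice of v (along `axis`) onto the probability simplex."""
    v = np.moveaxis(v, axis, -1)
    u = np.sort(v, axis=-1)[..., ::-1]                 # sort each slice in descending order
    css = np.cumsum(u, axis=-1) - 1.0
    idx = np.arange(1, v.shape[-1] + 1)
    rho = np.sum(u - css / idx > 0, axis=-1, keepdims=True)
    theta = np.take_along_axis(css, rho - 1, axis=-1) / rho
    return np.moveaxis(np.maximum(v - theta, 0.0), -1, axis)

def ll1_hu_gp(Y, R, step_S=1e-3, step_C=1e-3, beta=0.9, iters=200, seed=0):
    """Illustrative extrapolated two-block alternating GP for the model
    Y[i, j, k] ~ sum_r S[i, j, r] * C[k, r]  (hypothetical variable names)."""
    rng = np.random.default_rng(seed)
    I, J, K = Y.shape
    S = proj_simplex(rng.random((I, J, R)))            # abundances: per-pixel simplex
    C = rng.random((K, R))                             # endmember spectra: nonnegative
    S_old, C_old = S.copy(), C.copy()
    for _ in range(iters):
        # Abundance block: extrapolate, take a gradient step on the least-squares loss, project.
        S_ex = S + beta * (S - S_old)
        resid = np.einsum('ijr,kr->ijk', S_ex, C) - Y
        grad_S = np.einsum('ijk,kr->ijr', resid, C)
        S_old, S = S, proj_simplex(S_ex - step_S * grad_S)
        # Endmember block: extrapolate, gradient step, project onto the nonnegative orthant.
        C_ex = C + beta * (C - C_old)
        resid = np.einsum('ijr,kr->ijk', S, C_ex) - Y
        grad_C = np.einsum('ijk,ijr->kr', resid, S)
        C_old, C = C, np.maximum(C_ex - step_C * grad_C, 0.0)
    return S, C

In practice the step sizes would be tied to block-wise Lipschitz constants and the structural regularizers the paper considers would be added; this sketch only shows the alternating iteration pattern the abstract describes.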

Bibliographic Details
Main Authors: Meng Ding (School of Mathematics, Southwest Jiaotong University, Chengdu, China; ORCID 0000-0002-8670-0846), Xiao Fu (School of Electrical Engineering and Computer Science, Oregon State University, Corvallis, OR, USA; ORCID 0000-0003-4847-9586), Xi-Le Zhao (School of Mathematical Sciences, University of Electronic Science and Technology of China, Chengdu, China; ORCID 0000-0002-6540-946X)
Format: Article
Language: English
Published: IEEE, 2023-01-01
Series: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 16, pp. 1691-1709
DOI: 10.1109/JSTARS.2023.3238653
ISSN: 2151-1535
Collection: DOAJ (Directory of Open Access Journals)
Subjects: Hyperspectral unmixing; structured block-term tensor decomposition; alternating gradient projection
Online Access: https://ieeexplore.ieee.org/document/10022317/