Joint Dual-Structural Constrained and Non-negative Analysis Representation Learning for Pattern Classification
Main Authors: , ,
Format: Article
Language: English
Published: Taylor & Francis Group, 2023-12-01
Series: Applied Artificial Intelligence
Online Access: http://dx.doi.org/10.1080/08839514.2023.2180821
Summary: In recent years, the analysis dictionary learning (ADL) model has attracted much attention from researchers owing to its scalability and efficiency in representation-based classification. Despite embedding supervised label information, the classification performance of analysis representations suffers from redundant and noisy samples in real-world datasets. In this paper, we propose a joint Dual-Structural constrained and Non-negative Analysis Representation (DSNAR) learning model. First, a supervised latent structural transformation term is introduced implicitly to generate a roughly block-diagonal representation for intra-class samples. However, this discriminative structure is fragile and weak in the presence of noisy and redundant samples. To highlight both intra-class similarity and inter-class separation in the class-oriented representation, we then explicitly incorporate an off-block suppression term into the ADL model, together with a non-negative representation constraint, to achieve a well-structured and meaningful interpretation of the contributions of all class-oriented atoms. Moreover, a robust classification scheme in the latent space is proposed to avoid accidental incorrect predictions caused by noisy information. Finally, the DSNAR model is solved efficiently in an alternating fashion by the K-SVD method, an iteratively re-weighted method, and a gradient method. Extensive classification results on five benchmark datasets validate the performance superiority of our DSNAR model over other state-of-the-art dictionary learning models.
ISSN: 0883-9514, 1087-6545
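
To make the model described in the summary concrete, below is a minimal NumPy sketch of its two structural ingredients: a non-negative analysis representation Z = WX, and an off-block suppression penalty that discourages samples from activating atoms belonging to other classes. This is an illustration under simplifying assumptions, not the authors' implementation: the surrogate objective, the mask M, the weights `alpha` and `beta`, and the function names (`dsnar_sketch`, `off_block_mask`, `predict`) are our own, and the closed-form alternating updates stand in for the paper's K-SVD, iteratively re-weighted, and gradient steps.

```python
# Illustrative sketch only: a simplified surrogate of the DSNAR ideas in the
# summary above. An analysis dictionary W maps samples X to codes Z = W @ X
# that should be non-negative and roughly block diagonal (each sample mostly
# activates its own class's atoms). The objective, mask M, and parameters
# alpha/beta are assumptions, not the paper's formulation.
import numpy as np

def off_block_mask(atom_labels, sample_labels):
    """M[i, j] = 1 where atom i and sample j belong to different classes."""
    return (atom_labels[:, None] != sample_labels[None, :]).astype(float)

def dsnar_sketch(X, sample_labels, atoms_per_class=5, alpha=1.0, beta=0.1,
                 n_iter=50, seed=0):
    """Alternating minimization of a simplified surrogate objective:
        min_{W, Z >= 0}  ||W X - Z||_F^2 + alpha ||M * Z||_F^2 + beta ||W||_F^2
    where * is elementwise and M penalizes off-block (wrong-class) codes."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    classes = np.unique(sample_labels)
    atom_labels = np.repeat(classes, atoms_per_class)
    k = atom_labels.size
    W = rng.standard_normal((k, d)) / np.sqrt(d)   # analysis dictionary
    M = off_block_mask(atom_labels, sample_labels)
    gram_inv = np.linalg.inv(X @ X.T + beta * np.eye(d))  # X is fixed
    Z = np.zeros((k, n))
    for _ in range(n_iter):
        # Z-step: entrywise closed form of (z - a)^2 + alpha*m*z^2, then
        # projection onto the non-negative orthant (the constraint Z >= 0).
        A = W @ X
        Z = np.maximum(A / (1.0 + alpha * M), 0.0)
        # W-step: ridge regression keeping W X close to the structured codes Z.
        W = Z @ X.T @ gram_inv
    return W, Z, atom_labels

def predict(W, atom_labels, x):
    """Assign the class whose atoms capture the most non-negative code energy."""
    z = np.maximum(W @ x, 0.0)
    classes = np.unique(atom_labels)
    scores = [np.sum(z[atom_labels == c] ** 2) for c in classes]
    return classes[int(np.argmax(scores))]

# Tiny synthetic demo: two well-separated Gaussian classes in 20 dimensions.
rng = np.random.default_rng(1)
X = np.hstack([rng.standard_normal((20, 30)) + 2.0,
               rng.standard_normal((20, 30)) - 2.0])
y = np.array([0] * 30 + [1] * 30)
W, Z, atom_labels = dsnar_sketch(X, y)
acc = np.mean([predict(W, atom_labels, X[:, j]) == y[j] for j in range(X.shape[1])])
print(f"train accuracy on toy data: {acc:.2f}")
```

The class-energy rule in `predict` is a plausible stand-in for the paper's latent-space classification scheme: because training drives off-block entries of Z toward zero while keeping codes non-negative, the dominant per-class energy of a test code is a natural class indicator under these assumptions.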