Joint Dual-Structural Constrained and Non-negative Analysis Representation Learning for Pattern Classification
In recent years, the analysis dictionary learning (ADL) model has attracted much attention from researchers, owing to its scalability and efficiency in representation-based classification. Despite embedding supervised label information, the classification performance of analysis representations suffers from redundant and noisy samples in real-world datasets. In this paper, we propose a joint Dual-Structural constrained and Non-negative Analysis Representation (DSNAR) learning model. First, a supervised latent structural transformation term is incorporated implicitly to generate a roughly block-diagonal representation for intra-class samples. However, this discriminative structure is fragile in the presence of noisy and redundant samples. To highlight both intra-class similarity and inter-class separation in the class-oriented representation, we then explicitly add an off-block suppressing term to the ADL model, together with a non-negative representation constraint, yielding a well-structured and interpretable account of the contributions of all class-oriented atoms. Moreover, a robust classification scheme in the latent space is proposed to avoid incorrect predictions caused by noisy information. Finally, the DSNAR model is solved efficiently by alternating among the K-SVD method, an iterative re-weighted method, and a gradient method. Extensive classification results on five benchmark datasets validate the superior performance of our DSNAR model compared with other state-of-the-art DL models.
Main Authors: | Kun Jiang, Lei Zhu, Qindong Sun |
---|---|
Format: | Article |
Language: | English |
Published: | Taylor & Francis Group, 2023-12-01 |
Series: | Applied Artificial Intelligence |
Online Access: | http://dx.doi.org/10.1080/08839514.2023.2180821 |
_version_ | 1797684808533409792 |
author | Kun Jiang, Lei Zhu, Qindong Sun
author_facet | Kun Jiang, Lei Zhu, Qindong Sun
author_sort | Kun Jiang |
collection | DOAJ |
description | In recent years, the analysis dictionary learning (ADL) model has attracted much attention from researchers, owing to its scalability and efficiency in representation-based classification. Despite embedding supervised label information, the classification performance of analysis representations suffers from redundant and noisy samples in real-world datasets. In this paper, we propose a joint Dual-Structural constrained and Non-negative Analysis Representation (DSNAR) learning model. First, a supervised latent structural transformation term is incorporated implicitly to generate a roughly block-diagonal representation for intra-class samples. However, this discriminative structure is fragile in the presence of noisy and redundant samples. To highlight both intra-class similarity and inter-class separation in the class-oriented representation, we then explicitly add an off-block suppressing term to the ADL model, together with a non-negative representation constraint, yielding a well-structured and interpretable account of the contributions of all class-oriented atoms. Moreover, a robust classification scheme in the latent space is proposed to avoid incorrect predictions caused by noisy information. Finally, the DSNAR model is solved efficiently by alternating among the K-SVD method, an iterative re-weighted method, and a gradient method. Extensive classification results on five benchmark datasets validate the superior performance of our DSNAR model compared with other state-of-the-art DL models. |
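The abstract above describes the DSNAR objective only in words. As a hedged illustration (the symbols Ω, X, Q, E and the weights λ₁, λ₂ are assumptions for exposition, not taken from the paper), a generic ADL-style objective combining a block-diagonal structural target, an off-block suppressing term, and a non-negativity constraint can be sketched as:

```latex
% Hypothetical sketch of a DSNAR-style objective (symbols are illustrative):
%   Y : training samples (columns), grouped by class
%   \Omega : analysis dictionary;  X : analysis representation, X \approx \Omega Y
%   Q : ideal block-diagonal target encoding class labels
%   E : 0/1 mask selecting off-block (inter-class) entries of X
\begin{aligned}
\min_{\Omega,\,X}\quad
  & \|X - \Omega Y\|_F^2
    + \lambda_1 \,\|X - Q\|_F^2          % latent structural (block-diagonal) term
    + \lambda_2 \,\|E \odot X\|_F^2      % off-block suppressing term \\
\text{s.t.}\quad & X \ge 0               % non-negative representation constraint
\end{aligned}
```

Objectives of this shape are typically minimized by alternating updates: the representation X under the non-negativity constraint (e.g. via projection or re-weighting) and the dictionary Ω via K-SVD-style atom updates, which is consistent with the solvers named in the abstract.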
first_indexed | 2024-03-12T00:35:09Z |
format | Article |
id | doaj.art-6cc8ab2d6bd748d6a3d513de71ecc34f |
institution | Directory Open Access Journal |
issn | 0883-9514 1087-6545 |
language | English |
last_indexed | 2024-03-12T00:35:09Z |
publishDate | 2023-12-01 |
publisher | Taylor & Francis Group |
record_format | Article |
series | Applied Artificial Intelligence |
spelling | doaj.art-6cc8ab2d6bd748d6a3d513de71ecc34f2023-09-15T10:01:05ZengTaylor & Francis GroupApplied Artificial Intelligence0883-95141087-65452023-12-0137110.1080/08839514.2023.21808212180821Joint Dual-Structural Constrained and Non-negative Analysis Representation Learning for Pattern ClassificationKun Jiang0Lei Zhu1Qindong Sun2Xi’an University of TechnologyXi’an University of TechnologyXi’an Jiaotong UniversityIn recent years, analysis dictionary learning (ADL) model has attracted much attention from researchers, owing to its scalability and efficiency in representation-based classification. Despite the supervised label information embedding, the classification performance of analysis representation suffers from the redundant and noisy samples in real-world datasets. In this paper, we propose a joint Dual-Structural constrained and Non-negative Analysis Representation (DSNAR) learning model. First, the supervised latent structural transformation term is considered implicitly to generate a roughly block diagonal representation for intra-class samples. However, this discriminative structure is fragile and weak in the presence of noisy and redundant samples. To highlight both intra-class similarity and inter-class separation for class-oriented representation, we then explicitly incorporate an off-block suppressing term on the ADL model, together with a non-negative representation constraint, to achieve a well-structured and meaningful interpretation of the contributions from all class-oriented atoms. Moreover, a robust classification scheme in latent space is proposed to avoid accidental incorrect predictions with noisy information. Finally, the DSNAR model is alternatively solved by the K-SVD method, iterative re-weighted method and gradient method efficiently. Extensive classification results on five benchmark datasets validate the performance superiority of our DSNAR model compared to other state-of-the-art DL models.http://dx.doi.org/10.1080/08839514.2023.2180821 |
spellingShingle | Kun Jiang Lei Zhu Qindong Sun Joint Dual-Structural Constrained and Non-negative Analysis Representation Learning for Pattern Classification Applied Artificial Intelligence |
title | Joint Dual-Structural Constrained and Non-negative Analysis Representation Learning for Pattern Classification |
title_full | Joint Dual-Structural Constrained and Non-negative Analysis Representation Learning for Pattern Classification |
title_fullStr | Joint Dual-Structural Constrained and Non-negative Analysis Representation Learning for Pattern Classification |
title_full_unstemmed | Joint Dual-Structural Constrained and Non-negative Analysis Representation Learning for Pattern Classification |
title_short | Joint Dual-Structural Constrained and Non-negative Analysis Representation Learning for Pattern Classification |
title_sort | joint dual structural constrained and non negative analysis representation learning for pattern classification |
url | http://dx.doi.org/10.1080/08839514.2023.2180821 |
work_keys_str_mv | AT kunjiang jointdualstructuralconstrainedandnonnegativeanalysisrepresentationlearningforpatternclassification AT leizhu jointdualstructuralconstrainedandnonnegativeanalysisrepresentationlearningforpatternclassification AT qindongsun jointdualstructuralconstrainedandnonnegativeanalysisrepresentationlearningforpatternclassification |