Sparse Sequential Generalization of K-means for dictionary training on noisy signals

Noise incursion is an inherent problem when training a dictionary on noisy samples. Therefore, enforcing a structural constraint on the dictionary is useful for stable dictionary training. Recently, a sparse dictionary with predefined sparsity has been proposed as a structural constraint. However,...

Full description

Bibliographic Details
Main Authors: Sahoo, Sujit Kumar, Makur, Anamitra
Other Authors: School of Electrical and Electronic Engineering
Format: Journal Article
Language: English
Published: 2017
Subjects: Denoising; Sparse representation
Online Access:https://hdl.handle.net/10356/82295
http://hdl.handle.net/10220/43516
_version_ 1811681020910501888
author Sahoo, Sujit Kumar
Makur, Anamitra
author2 School of Electrical and Electronic Engineering
author_facet School of Electrical and Electronic Engineering
Sahoo, Sujit Kumar
Makur, Anamitra
author_sort Sahoo, Sujit Kumar
collection NTU
description Noise incursion is an inherent problem when training a dictionary on noisy samples. Therefore, enforcing a structural constraint on the dictionary is useful for stable dictionary training. Recently, a sparse dictionary with predefined sparsity has been proposed as a structural constraint. However, a fixed sparsity can be too rigid to adapt to the training samples. To address this issue, this article proposes a better solution through a sparse Sequential Generalization of K-means (SGK). The beauty of sparse-SGK is that it does not enforce a predefined rigid structure on the dictionary. Instead, a flexible sparse structure automatically emerges from the training samples depending on the amount of noise. In addition, a variation of sparse-SGK using an orthogonal base dictionary is proposed for quicker training. The advantages of sparse-SGK are demonstrated via 3-D image denoising. The experimental results confirm that sparse-SGK gives better denoising performance and takes less training time.
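The sequential, K-means-style atom update at the heart of SGK can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the greedy `omp` sparse-coding step, the function names, and all parameters are choices made for the example, and the sparse structuring of atoms over a base dictionary (the "sparse" part of sparse-SGK) is omitted.

```python
import numpy as np

def omp(D, y, k):
    """Greedy orthogonal matching pursuit: pick k atoms of D for signal y."""
    r, idx = y.copy(), []
    for _ in range(k):
        scores = np.abs(D.T @ r)
        scores[idx] = -1.0                       # do not reselect an atom
        idx.append(int(np.argmax(scores)))
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        r = y - D[:, idx] @ coef                 # residual after refitting
    x = np.zeros(D.shape[1])
    x[idx] = coef
    return x

def sgk_iteration(Y, D, k):
    """One sparse-coding pass plus a sequential SGK dictionary update."""
    X = np.column_stack([omp(D, y, k) for y in Y.T])
    for j in range(D.shape[1]):
        users = np.nonzero(X[j])[0]              # signals that use atom j
        if users.size == 0:
            continue                             # unused atom stays as-is
        # residual of those signals with atom j's contribution added back
        E = Y[:, users] - D @ X[:, users] + np.outer(D[:, j], X[j, users])
        d = E @ X[j, users]                      # least-squares atom update
        D[:, j] = d / np.linalg.norm(d)
    return D, X
```

The K-means connection: if k = 1 and the coefficients are forced to 1, each atom update reduces to averaging the signals assigned to it, which is exactly the K-means centroid step; SGK generalizes this to sparse codes with free coefficients.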
first_indexed 2024-10-01T03:34:19Z
format Journal Article
id ntu-10356/82295
institution Nanyang Technological University
language English
last_indexed 2024-10-01T03:34:19Z
publishDate 2017
record_format dspace
spelling ntu-10356/82295 2020-03-07T14:02:38Z Sparse Sequential Generalization of K-means for dictionary training on noisy signals Sahoo, Sujit Kumar Makur, Anamitra School of Electrical and Electronic Engineering Denoising Sparse representation MOE (Min. of Education, S’pore) Accepted version 2017-08-02T01:52:38Z 2019-12-06T14:52:43Z 2017-08-02T01:52:38Z 2019-12-06T14:52:43Z 2016 Journal Article Sahoo, S. K., & Makur, A. (2016). Sparse Sequential Generalization of K-means for dictionary training on noisy signals. Signal Processing, 129, 62-66. 0165-1684 https://hdl.handle.net/10356/82295 http://hdl.handle.net/10220/43516 10.1016/j.sigpro.2016.05.036 en Signal Processing © 2016 Elsevier. This is the author-created version of a work that has been peer reviewed and accepted for publication by Signal Processing, Elsevier. It incorporates the referees’ comments, but changes resulting from the publishing process, such as copyediting and structural formatting, may not be reflected in this document. The published version is available at: http://dx.doi.org/10.1016/j.sigpro.2016.05.036. 15 p. application/pdf
spellingShingle Denoising
Sparse representation
Sahoo, Sujit Kumar
Makur, Anamitra
Sparse Sequential Generalization of K-means for dictionary training on noisy signals
title Sparse Sequential Generalization of K-means for dictionary training on noisy signals
title_full Sparse Sequential Generalization of K-means for dictionary training on noisy signals
title_fullStr Sparse Sequential Generalization of K-means for dictionary training on noisy signals
title_full_unstemmed Sparse Sequential Generalization of K-means for dictionary training on noisy signals
title_short Sparse Sequential Generalization of K-means for dictionary training on noisy signals
title_sort sparse sequential generalization of k means for dictionary training on noisy signals
topic Denoising
Sparse representation
url https://hdl.handle.net/10356/82295
http://hdl.handle.net/10220/43516
work_keys_str_mv AT sahoosujitkumar sparsesequentialgeneralizationofkmeansfordictionarytrainingonnoisysignals
AT makuranamitra sparsesequentialgeneralizationofkmeansfordictionarytrainingonnoisysignals