A Novel Similarity Based Unsupervised Technique for Training Convolutional Filters

Bibliographic Details
Main Authors: Tugba Erkoc, Mustafa Taner Eskil
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10128130/
Description
Summary: Achieving satisfactory results with Convolutional Neural Networks (CNNs) depends on how effectively the filters are trained. Conventionally, an appropriate number of filters is carefully selected, the filters are initialized with a suitable scheme, and they are trained with backpropagation over several epochs. This training scheme requires a large labeled dataset, which is costly and time-consuming to obtain. In this study, we propose an unsupervised approach that extracts convolutional filters from a given dataset in a self-organized manner, processing the training set only once and without backpropagation training. The proposed method extracts filters from a given dataset in the absence of labels. In contrast to previous studies, there is no longer a need to select the best number of filters or a suitable filter weight initialization scheme. Applying this method to the MNIST, EMNIST-Digits, Kuzushiji-MNIST, and Fashion-MNIST datasets yields high test performances of 99.19%, 99.39%, 95.03%, and 90.11%, respectively, without backpropagation training and without any preprocessed or augmented data.
ISSN: 2169-3536
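
The abstract describes the approach only at a high level; the sketch below illustrates one way a similarity-based, single-pass filter extraction could look, not the authors' algorithm. The patch size, the cosine-similarity threshold, the non-overlapping patch stride, and the extract_filters helper are all illustrative assumptions. A patch is promoted to a new filter only when it is not sufficiently similar to any filter collected so far, so the number of filters emerges from the data rather than being fixed in advance.

# Hypothetical sketch of similarity-based, single-pass filter extraction.
# NOT the method from the paper: patch size, threshold, and stride are
# illustrative assumptions chosen only to make the idea concrete.
import numpy as np

def extract_filters(images, patch_size=5, sim_threshold=0.9):
    """Collect convolutional filters from image patches in one pass.

    A patch becomes a new filter only if its cosine similarity to every
    filter collected so far falls below sim_threshold, so the filter
    count is determined by the data instead of being chosen in advance.
    """
    filters = []  # each entry is a unit-norm vector of length patch_size**2
    for img in images:
        h, w = img.shape
        # Non-overlapping patches; a denser stride is equally possible.
        for y in range(0, h - patch_size + 1, patch_size):
            for x in range(0, w - patch_size + 1, patch_size):
                patch = img[y:y + patch_size, x:x + patch_size].astype(np.float64).ravel()
                norm = np.linalg.norm(patch)
                if norm < 1e-8:  # skip flat or empty patches
                    continue
                patch /= norm
                if not filters or max(float(f @ patch) for f in filters) < sim_threshold:
                    filters.append(patch)
    return np.stack(filters).reshape(-1, patch_size, patch_size)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    toy_images = rng.random((10, 28, 28))  # stand-in for MNIST-sized images
    bank = extract_filters(toy_images)
    print("filters extracted:", bank.shape[0])

On real data, a bank collected this way could serve as the weights of a first convolutional layer, with the similarity threshold controlling how many filters are retained.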