The Weights Reset Technique for Deep Neural Networks Implicit Regularization

Bibliographic Details
Main Authors: Grigoriy Plusch, Sergey Arsenyev-Obraztsov, Olga Kochueva
Format: Article
Language: English
Published: MDPI AG, 2023-08-01
Series: Computation, Vol. 11, Issue 8, Article No. 148
ISSN: 2079-3197
DOI: 10.3390/computation11080148
Author Affiliations: Department of Applied Mathematics and Computer Modeling, National University of Oil and Gas “Gubkin University”, 65, Leninsky Prospekt, 119991 Moscow, Russia (all three authors)
Subjects: machine learning, deep learning, implicit regularization, computer vision
Online Access:https://www.mdpi.com/2079-3197/11/8/148
Description
We present a new regularization method called Weights Reset, which consists in periodically resetting a random portion of layer weights during training to values drawn from predefined probability distributions. We applied and tested the technique on several popular classification datasets (Caltech-101, CIFAR-100, and Imagenette) and compared the results against traditional regularization methods. The results demonstrate that Weights Reset is competitive, achieving the best performance on the Imagenette dataset and on the challenging, unbalanced Caltech-101 dataset. The method also shows potential for preventing vanishing and exploding gradients. This analysis is preliminary, however, and further comprehensive studies are needed to understand the capabilities and limitations of the method. The observed results indicate that Weights Reset can be regarded as an effective extension of traditional regularization methods that helps improve model performance and generalization.
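
The record describes the method only at the level of the abstract. As a rough illustration of the idea, the sketch below implements a periodic partial weight reset in PyTorch. It is not the authors' published implementation: the helper name weights_reset, the choice of Kaiming-normal as the reset distribution, and the hyperparameters reset_fraction and reset_every are illustrative assumptions.

# Minimal sketch of the Weights Reset idea from the abstract. Not the
# authors' code: `weights_reset`, the Kaiming-normal reset distribution,
# and `reset_fraction` / `reset_every` are illustrative assumptions.

import torch
import torch.nn as nn


@torch.no_grad()
def weights_reset(model: nn.Module, reset_fraction: float = 0.05) -> None:
    """Redraw a random fraction of each Linear/Conv2d weight tensor
    from a predefined distribution (here, Kaiming normal)."""
    for module in model.modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)):
            w = module.weight
            fresh = torch.empty_like(w)        # candidate values from the reset distribution
            nn.init.kaiming_normal_(fresh)
            mask = torch.rand_like(w) < reset_fraction  # Bernoulli mask: which entries to reset
            w[mask] = fresh[mask]


# Usage inside an ordinary training loop (sketch):
#   for step, (x, y) in enumerate(loader):
#       loss = criterion(model(x), y)
#       optimizer.zero_grad(); loss.backward(); optimizer.step()
#       if step % reset_every == 0:            # e.g. reset_every = 1000
#           weights_reset(model, reset_fraction=0.05)

Note that this sketch leaves optimizer state (e.g., Adam moment estimates) for the reset entries untouched; whether and how such details should be handled falls under the further study the abstract calls for.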