The Weights Reset Technique for Deep Neural Networks Implicit Regularization

We present a new regularization method called Weights Reset, which involves periodically resetting a random portion of layer weights during training using predefined probability distributions. This technique was applied and tested on several popular classification datasets, Caltech-101,...
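The abstract only sketches the mechanism; below is a minimal, hypothetical PyTorch sketch of what such a periodic weight reset could look like. The Bernoulli selection with `reset_prob`, the zero-mean normal replacement distribution, and the every-fifth-epoch schedule are illustrative assumptions, not the authors' actual configuration.

```python
import torch
import torch.nn as nn

def weights_reset(layer: nn.Linear, reset_prob: float = 0.1) -> None:
    """Reset a randomly selected subset of a layer's weights by redrawing
    them from a predefined distribution (a zero-mean normal here)."""
    with torch.no_grad():
        w = layer.weight
        # Each weight is chosen for reset independently with probability reset_prob.
        mask = torch.rand_like(w) < reset_prob
        # Replacement values drawn from the chosen distribution (assumed here).
        new_values = torch.empty_like(w).normal_(mean=0.0, std=0.05)
        w[mask] = new_values[mask]

# Illustrative training loop: apply the reset periodically.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
for epoch in range(20):
    # ... one epoch of ordinary training would go here ...
    if (epoch + 1) % 5 == 0:  # e.g. reset at the end of every 5th epoch
        for module in model.modules():
            if isinstance(module, nn.Linear):
                weights_reset(module, reset_prob=0.1)
```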

Bibliographic Details
Main Authors: Grigoriy Plusch, Sergey Arsenyev-Obraztsov, Olga Kochueva
Format: Article
Language: English
Published: MDPI AG 2023-08-01
Series: Computation
Subjects:
Online Access: https://www.mdpi.com/2079-3197/11/8/148