Evidence-Based Regularization for Neural Networks

Numerous approaches address over-fitting in neural networks: imposing a penalty on the network parameters (L1, L2, etc.); changing the network stochastically (dropout, Gaussian noise, etc.); or transforming the input data (batch normalization, etc.). In contrast, we aim to ensure th...
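The abstract itself contains no code; as a minimal sketch of two of the techniques it names, an L2 weight penalty and inverted dropout might look like the following (function names and the pure-Python formulation are illustrative, not from the article):

```python
import random

def l2_penalty(weights, lam):
    """L2 regularization term: lam times the sum of squared weights,
    added to the training loss to discourage large parameters."""
    return lam * sum(w * w for w in weights)

def dropout(activations, p, rng):
    """Inverted dropout: zero each unit with probability p and scale
    survivors by 1/(1-p) so the expected activation is unchanged."""
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

# Example usage
rng = random.Random(0)
penalty = l2_penalty([1.0, -2.0, 0.5], lam=0.01)
dropped = dropout([0.3, 0.7, 1.2, 0.9], p=0.5, rng=rng)
```

Both mechanisms act during training only; at inference time the penalty is dropped and all units are kept, which is why the inverted form rescales during training rather than at test time.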


Bibliographic Details
Main Authors: Giuseppe Nuti, Andreea-Ingrid Cross, Philipp Rindler
Format: Article
Language: English
Published: MDPI AG 2022-11-01
Series: Machine Learning and Knowledge Extraction
Online Access: https://www.mdpi.com/2504-4990/4/4/51