Pruning Feedforward Polynomial Neural Networks with Smoothing Elastic Net Regularization
Gradient methods are preferred for training and pruning neural networks, where regularization terms are primarily intended to remove redundant weights from the network. Many machine learning libraries use elastic net regularization (ENR), also called double regularization, which is a combination...
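As a rough illustration of the idea behind ENR-based pruning with gradient methods, the following Python sketch combines an L1 and an L2 penalty and replaces the non-differentiable |w| term with a smooth surrogate so plain gradient descent can be applied. The specific smoothing function, penalty weights, and function names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def smoothed_enr_penalty(w, lam1=1e-3, lam2=1e-3, eps=1e-4):
    """Elastic net penalty with a smoothed L1 term (illustrative sketch).

    |w| is approximated by sqrt(w^2 + eps^2), which is differentiable at
    zero, so the penalty gradient can simply be added to the loss gradient
    during each weight update. Small penalized weights are then driven
    toward zero and can be pruned after training.
    """
    smooth_abs = np.sqrt(w**2 + eps**2)            # smooth surrogate for |w|
    penalty = lam1 * smooth_abs.sum() + 0.5 * lam2 * np.sum(w**2)
    grad = lam1 * (w / smooth_abs) + lam2 * w      # gradient of the penalty
    return penalty, grad

# Usage: add `grad` to the gradient of the training loss at every step,
# then prune weights whose magnitude falls below a small threshold.
w = np.array([0.5, -0.01, 0.0, 2.0])
p, g = smoothed_enr_penalty(w)
print(p, g)
```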
| Main Author: | Khidir Shaib Mohamed |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IFSA Publishing, S.L., 2023-05-01 |
| Series: | Sensors & Transducers |
| Subjects: | |
| Online Access: | https://sensorsportal.com/HTML/DIGEST/may_2023/Vol_260/P_3289.pdf |
Similar Items
- Batch Gradient Learning Algorithm with Smoothing $L_1$ Regularization for Feedforward Neural Networks
  by: Khidir Shaib Mohamed
  Published: (2022-12-01)
- A new Sigma-Pi-Sigma neural network based on $ L_1 $ and $ L_2 $ regularization and applications
  by: Jianwei Jiao, et al.
  Published: (2024-02-01)
- An Application of Elastic-Net Regularized Linear Inverse Problem in Seismic Data Inversion
  by: Ronghuo Dai, et al.
  Published: (2023-01-01)
- Parsimonious Predictive Mortality Modeling by Regularization and Cross-Validation with and without Covid-Type Effect
  by: Karim Barigou, et al.
  Published: (2020-12-01)
- Identifying Factors Predicting Kidney Graft Survival in Chile Using Elastic-Net-Regularized Cox’s Regression
  by: Leandro Magga, et al.
  Published: (2022-09-01)