Learnable Leaky ReLU (LeLeLU): An Alternative Accuracy-Optimized Activation Function

In neural networks, the activation function is a vital component of the learning and inference process. There are many different approaches, but only nonlinear activation functions allow such networks to solve non-trivial problems using only a small number of nodes, and such activation function...
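To illustrate the idea behind a learnable leaky ReLU, here is a minimal sketch assuming a per-layer learnable scale `alpha` applied to a conventional leaky ReLU with a small fixed negative slope `leak`; the paper's exact formulation and parameterization may differ, and the function name and arguments below are hypothetical.

```python
import numpy as np

def lelelu(x, alpha=1.0, leak=0.01):
    """Leaky-ReLU-style activation scaled by a learnable alpha.

    Sketch only: in a real network, alpha would be a trainable
    parameter updated by backpropagation; leak is the usual
    fixed leaky-ReLU negative slope.
    """
    x = np.asarray(x, dtype=float)
    # Positive inputs pass through scaled by alpha;
    # negative inputs are additionally damped by leak.
    return np.where(x >= 0, alpha * x, alpha * leak * x)

out = lelelu([-2.0, 0.0, 3.0], alpha=2.0)
# Negative input: 2.0 * 0.01 * -2.0 = -0.04; positive: 2.0 * 3.0 = 6.0
```

Because the output is linear in `alpha`, its gradient is straightforward, which is what makes the scale cheap to learn alongside the network weights.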


Bibliographic Details
Main Authors: Andreas Maniatopoulos, Nikolaos Mitianoudis
Format: Article
Language: English
Published: MDPI AG 2021-12-01
Series: Information
Subjects:
Online Access: https://www.mdpi.com/2078-2489/12/12/513