HcLSH: A Novel Non-Linear Monotonic Activation Function for Deep Learning Methods

Activation functions are essential components of any neural network model; they play a crucial role in determining the network's expressive power through the non-linearity they introduce. The Rectified Linear Unit (ReLU) has been the best-known and default choice for most deep neural network models...
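For reference, a minimal Python sketch of the baseline ReLU activation the abstract mentions; this illustrates only ReLU itself, not the proposed HcLSH function, whose definition is not included in this excerpt:

    import numpy as np

    def relu(x):
        # ReLU: element-wise max(0, x); identity for positive
        # inputs, zero for negative inputs
        return np.maximum(0.0, x)

    # Negative inputs are zeroed; positive inputs pass through
    print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # [0.  0.  0.  1.5]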


Bibliographic Details
Main Authors: Heba Abdel-Nabi, Ghazi Al-Naymat, Mostafa Z. Ali, Arafat Awajan
Format: Article
Language: English
Published: IEEE, 2023-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10124188/