DPReLU: Dynamic Parametric Rectified Linear Unit and Its Proper Weight Initialization Method

Abstract: Activation functions are essential in deep learning, and the rectified linear unit (ReLU) is the most widely used activation function for addressing the vanishing gradient problem. However, owing to the dying ReLU problem and the bias shift effect, deep learning models using ReLU cannot exploit the p...
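The record only names the technique (a dynamic parametric ReLU with a matching weight initialization) and does not reproduce its definition, so the following is a minimal sketch of a generic ReLU variant with learnable slopes, written for PyTorch. The class name and the parameters `alpha` and `beta` are illustrative assumptions, not the authors' DPReLU formulation.

```python
# Sketch of a ReLU variant with trainable slopes (illustrative only; not the
# paper's DPReLU definition, which this bibliographic record does not include).
import torch
import torch.nn as nn

class LearnableSlopeReLU(nn.Module):
    def __init__(self, alpha_init: float = 0.25, beta_init: float = 1.0):
        super().__init__()
        # Both slopes are learnable, so the activation shape can adapt during training.
        self.alpha = nn.Parameter(torch.tensor(alpha_init))  # slope for x < 0
        self.beta = nn.Parameter(torch.tensor(beta_init))    # slope for x >= 0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # A nonzero, trainable negative slope keeps gradients flowing for negative
        # inputs, one common way to mitigate the dying ReLU problem noted above.
        return torch.where(x >= 0, self.beta * x, self.alpha * x)

if __name__ == "__main__":
    act = LearnableSlopeReLU()
    x = torch.randn(4, 8)
    print(act(x).shape)  # torch.Size([4, 8])
```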

Bibliographic Details
Main Authors: Donghun Yang, Kien Mai Ngoc, Iksoo Shin, Myunggwon Hwang
Format: Article
Language: English
Published: Springer, 2023-02-01
Series: International Journal of Computational Intelligence Systems
Online Access: https://doi.org/10.1007/s44196-023-00186-w