Self-gated rectified linear unit for performance improvement of deep neural networks

This technical paper proposes an activation function, the self-gated rectified linear unit (SGReLU), to achieve high classification accuracy, low loss, and low computational time. The vanishing gradient problem, dying ReLU, and noise vulnerability are also resolved by the proposed SGReLU function. SGReLU’s perf...
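The record does not give SGReLU's exact formula, but the general idea of a self-gated activation can be illustrated with the well-known Swish-style gating, f(x) = x·σ(x), where the input gates itself through a sigmoid. This is a hedged sketch of the *family* of self-gated activations, not the paper's specific SGReLU definition:

```python
import numpy as np

def sigmoid(x):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def self_gated_activation(x):
    # Illustrative self-gated activation (Swish-style), NOT the
    # paper's exact SGReLU: the input gates itself via a sigmoid.
    # Negative inputs retain a small, smooth response (mitigating
    # "dying ReLU"), while large positive inputs pass through
    # almost linearly (mitigating vanishing gradients).
    return x * sigmoid(x)

if __name__ == "__main__":
    x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
    print(self_gated_activation(x))
```

Unlike plain ReLU, which outputs exactly zero for all negative inputs, this gated form stays differentiable everywhere and never has a hard zero-gradient region.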


Bibliographic Details
Main Authors: Israt Jahan, Md. Faisal Ahmed, Md. Osman Ali, Yeong Min Jang
Format: Article
Language: English
Published: Elsevier 2023-06-01
Series: ICT Express
Online Access: http://www.sciencedirect.com/science/article/pii/S2405959521001776