Self-gated rectified linear unit for performance improvement of deep neural networks
This paper proposes an activation function, the self-gated rectified linear unit (SGReLU), to achieve high classification accuracy, low loss, and low computational time. The vanishing gradient problem, dying ReLU, and noise vulnerability are also resolved by the proposed SGReLU function. SGReLU’s perf...
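The record does not reproduce the paper's exact SGReLU formula, so the following is only a minimal sketch of what a self-gated rectified activation generally looks like, assuming a SiLU/Swish-style gating f(x) = x · σ(x). That form behaves like ReLU for large positive inputs while keeping a nonzero gradient for negative inputs, which is one common way to address the dying-ReLU and vanishing-gradient issues the abstract mentions; the paper's actual definition may differ.

```python
# Hypothetical sketch of a self-gated activation in the spirit of SGReLU.
# Assumes f(x) = x * sigmoid(x) (SiLU/Swish-like self-gating); this is an
# illustration only, not the paper's published formula.
import numpy as np

def self_gated_relu(x: np.ndarray) -> np.ndarray:
    """Self-gated activation: each input gates itself through a sigmoid."""
    return x * (1.0 / (1.0 + np.exp(-x)))

def self_gated_relu_grad(x: np.ndarray) -> np.ndarray:
    """Derivative sigma(x) + x*sigma(x)*(1-sigma(x)); nonzero for x < 0,
    unlike ReLU, so negative-input units still receive gradient."""
    s = 1.0 / (1.0 + np.exp(-x))
    return s + x * s * (1.0 - s)

if __name__ == "__main__":
    xs = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
    print(self_gated_relu(xs))       # near 0 for x << 0, near x for x >> 0
    print(self_gated_relu_grad(xs))  # stays nonzero on the negative side
```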
Main Authors:
Format: Article
Language: English
Published: Elsevier, 2023-06-01
Series: ICT Express
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2405959521001776