RELU-Function and Derived Function Review
The activation function plays an important role in training and improving the performance of deep neural networks (DNNs). The rectified linear unit (ReLU) function provides the non-linearity a DNN requires. However, few papers survey and compare the various ReLU activati...
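As background for the abstract above, the standard ReLU function and its derivative can be sketched as follows (a minimal NumPy illustration, not code from the reviewed article):

```python
import numpy as np

def relu(x):
    # ReLU: element-wise max(0, x)
    return np.maximum(0.0, x)

def relu_derivative(x):
    # Derivative: 1 for x > 0, else 0 (undefined at 0; 0 by convention)
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))             # negative inputs clipped to 0
print(relu_derivative(x))  # 0 where x <= 0, 1 where x > 0
```

The zero derivative for negative inputs is what motivates the ReLU variants (e.g. leaky ReLU) that reviews of this kind typically compare.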
| Main Author: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | EDP Sciences, 2022-01-01 |
| Series: | SHS Web of Conferences |
| Online Access: | https://www.shs-conferences.org/articles/shsconf/pdf/2022/14/shsconf_stehf2022_02006.pdf |