PARAMETRIC FLATTEN-T SWISH: AN ADAPTIVE NONLINEAR ACTIVATION FUNCTION FOR DEEP LEARNING

Activation function is a key component in deep learning that performs non-linear mappings between the inputs and outputs. Rectified Linear Unit (ReLU) has been the most popular activation function across the deep learning community. However, ReLU contains several shortcomings that can result in in...
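The abstract above concerns the Parametric Flatten-T Swish (PFTS) activation. As a rough sketch only, Flatten-T Swish is commonly described as applying Swish-like scaling (x times the sigmoid of x) shifted by a threshold T for non-negative inputs and outputting the constant T otherwise, with the parametric variant making T learnable; the exact formulation used in this article may differ, so treat the function below as an illustrative assumption rather than the authors' definition:

```python
import numpy as np

def parametric_flatten_t_swish(x, T=-0.20):
    """Hypothetical sketch of the PFTS activation.

    Assumes the Flatten-T Swish form: x * sigmoid(x) + T for x >= 0,
    and the constant T for x < 0. In the parametric variant, T would
    be a trainable parameter rather than the fixed value used here.
    """
    x = np.asarray(x, dtype=float)
    sigmoid = 1.0 / (1.0 + np.exp(-x))
    return np.where(x >= 0, x * sigmoid + T, T)
```

Evaluated at x = 0 this yields exactly T, so the positive and negative branches meet continuously at the origin.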

Bibliographic Details
Main Authors: Hock Hung Chieng, Noorhaniza Wahid, Pauline Ong
Format: Article
Language: English
Published: UUM Press 2020-11-01
Series: Journal of ICT
Online Access: https://e-journal.uum.edu.my/index.php/jict/article/view/12398