SinLU: Sinu-Sigmoidal Linear Unit

Non-linear activation functions are integral parts of deep neural architectures. Given the large and complex datasets that neural networks operate on, their computational complexity and approximation capability can differ significantly depending on which activation function is used. Parameterizing an activation function...
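As a rough illustration of the parameterized activation the abstract refers to, the sketch below implements a SinLU-style unit in PyTorch with trainable parameters a and b. The exact form (x + a*sin(b*x)) * sigmoid(x) and the names SinLU, a, and b are assumptions drawn from the article's title and stated approach, not a reproduction of the authors' code.

    import torch
    import torch.nn as nn

    class SinLU(nn.Module):
        # Sinu-sigmoidal linear unit with trainable parameters a and b.
        # Assumed form: SinLU(x) = (x + a * sin(b * x)) * sigmoid(x),
        # where a and b are learned jointly with the network weights.
        def __init__(self, a: float = 1.0, b: float = 1.0):
            super().__init__()
            self.a = nn.Parameter(torch.tensor(a))
            self.b = nn.Parameter(torch.tensor(b))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return (x + self.a * torch.sin(self.b * x)) * torch.sigmoid(x)

Under these assumptions, such a module could be used as a drop-in replacement for ReLU or SiLU in a network, with a and b updated by the optimizer alongside the other parameters.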

Bibliographic Details
Main Authors: Ashis Paul, Rajarshi Bandyopadhyay, Jin Hee Yoon, Zong Woo Geem, Ram Sarkar
Format: Article
Language: English
Published: MDPI AG 2022-01-01
Series: Mathematics
Online Access: https://www.mdpi.com/2227-7390/10/3/337