A Large-Scale Study of Activation Functions in Modern Deep Neural Network Architectures for Efficient Convergence


Bibliographic Details
Main Authors: Andrinandrasana David Rasamoelina, Ivan Cík, Peter Sincak, Marián Mach, Lukáš Hruška
Format: Article
Language: English
Published: Asociación Española para la Inteligencia Artificial 2022-12-01
Series: Inteligencia Artificial
Online Access: https://journal.iberamia.org/index.php/intartif/article/view/845
Description
Summary: Activation functions play an important role in the convergence of learning algorithms based on neural networks. They provide neural networks with nonlinearity and the ability to fit complex data. However, no in-depth study exists in the literature on the behavior of activation functions in modern architectures. Therefore, in this research, we compare the 18 most widely used activation functions on multiple datasets (CIFAR-10, CIFAR-100, CALTECH-256) using 4 different models (EfficientNet, ResNet, a variation of ResNet using the bag of tricks, and MobileNet V3). Furthermore, we explore the shape of the loss landscape of these different architectures with various activation functions. Lastly, based on the results of our experiments, we introduce a new locally quadratic activation function, namely Hytana, alongside one variation, Parametric Hytana, which outperform common activation functions and address the dying ReLU problem.
ISSN: 1137-3601, 1988-3064