PolyLU: A Simple and Robust Polynomial-Based Linear Unit Activation Function for Deep Learning

The activation function has a critical influence on whether a convolutional neural network in deep learning can converge; a proper activation function not only makes the convolutional neural network converge faster but can also reduce the complexity of the convolutional neural network architecture...
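For readers who want to experiment with the idea, the sketch below shows a PolyLU-style activation as a PyTorch module. The piecewise formula used here (identity for x ≥ 0 and 1/(1 − x) − 1 for x < 0), and the choice of PyTorch, are assumptions not stated in this record; the authoritative definition and analysis are in the article linked under Online Access.

```python
# Minimal sketch of a PolyLU-style activation in PyTorch.
# ASSUMPTION: the piecewise form below (identity for x >= 0, 1/(1 - x) - 1
# for x < 0) is taken from common descriptions of PolyLU and is not given
# in this record; check the linked article for the exact definition.
import torch
import torch.nn as nn


class PolyLU(nn.Module):
    """PolyLU-style unit: identity on the positive side, a smooth
    negative tail that approaches -1 as x -> -inf."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Clamp the argument of the negative branch so 1 / (1 - x) is never
        # evaluated at x = 1 (torch.where computes both branches).
        neg = 1.0 / (1.0 - torch.clamp(x, max=0.0)) - 1.0
        return torch.where(x >= 0, x, neg)


if __name__ == "__main__":
    act = PolyLU()
    x = torch.linspace(-5.0, 5.0, 11)
    print(act(x))  # negative inputs map into (-1, 0), non-negative pass through
```

With this form the function is continuous and has slope 1 at the origin from both sides, so it behaves like ReLU for positive inputs while keeping a bounded, non-zero response for negative inputs.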


Bibliographic Details
Main Authors: Han-Shen Feng, Cheng-Hsiung Yang
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10251412/