Low‐complexity neuron for fixed‐point artificial neural networks with ReLU activation function in energy‐constrained wireless applications

Abstract: This work introduces an efficient neuron design for fixed‐point artificial neural networks with the rectified linear unit (ReLU) activation function for energy‐constrained wireless applications. Fixed‐point binary numbers and the ReLU activation function are used in most application‐specifi...

Bibliographic Details
Main Authors: Wen‐Long Chin, Qinyu Zhang, Tao Jiang
Format: Article
Language: English
Published: Wiley 2021-04-01
Series: IET Communications
Online Access: https://doi.org/10.1049/cmu2.12129