Low‐complexity neuron for fixed‐point artificial neural networks with ReLU activation function in energy‐constrained wireless applications
Abstract: This work introduces an efficient neuron design for fixed‐point artificial neural networks with the rectified linear unit (ReLU) activation function, targeting energy‐constrained wireless applications. Fixed‐point binary numbers and the ReLU activation function are used in most application‐specifi...
Main Authors: , ,
Format: Article
Language: English
Published: Wiley, 2021-04-01
Series: IET Communications
Subjects:
Online Access: https://doi.org/10.1049/cmu2.12129