SHE-MTJ based ReLU-max pooling functions for on-chip training of neural networks

We present a detailed investigation of various routes to optimize the power consumption of spintronic devices that implement the rectified linear activation (ReLU) and max-pooling functions. We examine the influence of various spin Hall effect layers and their input resistances on the power...
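For reference, a minimal software sketch of the ReLU and max-pooling operations that the proposed spintronic hardware emulates is given below. The function names (relu, max_pool_2x2) and the 2x2 window with stride 2 are illustrative assumptions, not details taken from the article.

```python
import numpy as np

def relu(x):
    # Rectified linear activation: passes positive values, clamps negatives to zero.
    return np.maximum(0.0, x)

def max_pool_2x2(feature_map):
    # Assumed 2x2 max pooling with stride 2: keeps the largest value in each window.
    h, w = feature_map.shape
    trimmed = feature_map[: h - h % 2, : w - w % 2]
    return trimmed.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

# Toy 4x4 feature map to show ReLU followed by max pooling.
x = np.array([[-1.0,  2.0,  0.5, -3.0],
              [ 4.0, -2.0,  1.0,  0.0],
              [-0.5,  3.0, -1.0,  2.5],
              [ 1.5, -4.0,  0.5,  1.0]])
print(max_pool_2x2(relu(x)))  # [[4.  1. ], [3.  2.5]]
```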


Bibliographic Details
Main Authors: Venkatesh Vadde, Bhaskaran Muralidharan, Abhishek Sharma
Format: Article
Language: English
Published: AIP Publishing LLC 2024-02-01
Series: AIP Advances
Online Access: http://dx.doi.org/10.1063/9.0000685