Low Power Neural Network by Reducing SRAM Operating Voltage


Bibliographic Details
Main Authors: Keisuke Kozu, Yuya Tanabe, Masato Kitakami, Kazuteru Namba
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9936634/
Description
Summary: With advancements in machine learning technology, networks are becoming increasingly complex, and the amount of computation involved is growing. Consequently, the computation time and power consumption of the learning process also increase. The error tolerance of neural networks has attracted attention as an approach to this problem: because neural networks can tolerate small errors, calculation speed and power consumption can be reduced at the expense of accuracy. In this study, we propose a method to reduce the power consumption of the circuit by lowering the operating voltage of the static random-access memory (SRAM) used to store the weights. In the proposed method, the SRAM operates at two different voltages, so that error-tolerant and non-error-tolerant bits are stored with different bit error rates (BERs). We demonstrate the relationship between the BER and the recognition rate, and identify combinations of BER and circuit configuration that maintain a high recognition rate.
ISSN: 2169-3536
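
The dual-voltage idea described in the summary can be illustrated with a small simulation. The Python sketch below (not taken from the paper) injects random bit flips into 8-bit quantized weights, applying a low BER to the upper, non-error-tolerant bits (assumed to stay at nominal SRAM voltage) and a higher BER to the lower, error-tolerant bits (assumed to be stored at reduced voltage). The word length, the protected-bit count, both BER values, and the helper name inject_sram_errors are illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

N_BITS = 8           # assumed weight word length
N_PROTECTED = 4      # assumed number of upper bits kept at nominal voltage
BER_NOMINAL = 1e-9   # assumed BER at the nominal SRAM voltage
BER_REDUCED = 1e-3   # assumed BER at the reduced SRAM voltage

def inject_sram_errors(weights_q):
    """Flip bits of uint8-coded weights with a per-bit-position error rate."""
    w = weights_q.copy()
    for bit in range(N_BITS):
        # Upper bits are protected (low BER); lower bits tolerate errors (high BER).
        ber = BER_NOMINAL if bit >= N_BITS - N_PROTECTED else BER_REDUCED
        flips = rng.random(w.shape) < ber      # which SRAM cells fail
        w[flips] ^= np.uint8(1 << bit)         # flip that bit position
    return w

# Usage: quantize float weights to 8 bits, corrupt them, and dequantize.
w_float = rng.normal(0.0, 0.1, size=(256, 128)).astype(np.float32)
scale = np.abs(w_float).max() / 127.0
w_q = np.clip(np.round(w_float / scale), -128, 127).astype(np.int8).view(np.uint8)
w_noisy = inject_sram_errors(w_q).view(np.int8).astype(np.float32) * scale
# Evaluating the network with w_noisy in place of w_float indicates how the
# recognition rate degrades as BER_REDUCED grows.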