Area-Efficient Mapping of Convolutional Neural Networks to Memristor Crossbars Using Sub-Image Partitioning


Bibliographic Details
Main Authors: Seokjin Oh, Jiyong An, Kyeong-Sik Min
Format: Article
Language: English
Published: MDPI AG 2023-01-01
Series: Micromachines
Subjects: area-efficient mapping; convolutional neural networks; memristor crossbars; sub-image partitioning
Online Access: https://www.mdpi.com/2072-666X/14/2/309
author Seokjin Oh
Jiyong An
Kyeong-Sik Min
collection DOAJ
description Memristor crossbars can be very useful for realizing edge-intelligence hardware, because neural networks implemented with memristor crossbars can save significantly more computing energy and layout area than conventional CMOS (complementary metal–oxide–semiconductor) digital circuits. One of the most important operations in neural networks is convolution. To perform convolution on memristor crossbars, the full image should be partitioned into several sub-images, so that each sub-image convolution can be mapped to small unit crossbars, whose size should be limited to 128 × 128 or 256 × 256 to avoid the line-resistance problem caused by large crossbars. In this paper, various convolution schemes with 3D, 2D, and 1D kernels are analyzed and compared in terms of the neural network's performance and the overlapping overhead. The neural-network simulations indicate that 2D + 1D kernels can perform the sub-image convolution using a much smaller number of unit crossbars, with less rate loss, than 3D kernels. When tested on the CIFAR-10 dataset, mapping the sub-image convolution of 2D + 1D kernels to crossbars reduces the number of unit crossbars by almost 90% and 95% for 128 × 128 and 256 × 256 crossbars, respectively, compared with 3D kernels, while the rate loss of the 2D + 1D kernels remains below 2%. To further improve the neural network's performance, the 2D + 1D kernels can be combined with 3D kernels in one neural network. When the normalized ratio of 2D + 1D layers is around 0.5, the neural network shows very little rate loss compared with a normalized ratio of zero, while the number of unit crossbars at a ratio of 0.5 is reduced by half compared with a ratio of 0.
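The description above rests on two quantitative ideas: the full feature map is partitioned into sub-images small enough to map onto 128 × 128 or 256 × 256 unit crossbars, and neighbouring sub-images must overlap by roughly the kernel width, which produces the overlapping overhead being compared. The sketch below only illustrates that partitioning arithmetic; it is not the mapping procedure from the paper, and the 32 × 32 image size, tile sizes, and 3 × 3 kernel are illustrative assumptions.

```python
def partition_with_halo(height: int, width: int, tile: int, kernel: int = 3):
    """Split a (height x width) feature map into tile x tile sub-images, each
    extended by a (kernel - 1) // 2 pixel halo on every interior side, so a
    'same'-padded kernel x kernel convolution of each sub-image can be computed
    independently and stitched back without seams."""
    halo = (kernel - 1) // 2
    tiles = []
    for top in range(0, height, tile):
        for left in range(0, width, tile):
            # Clip the haloed window at the feature-map borders.
            r0, r1 = max(0, top - halo), min(height, top + tile + halo)
            c0, c1 = max(0, left - halo), min(width, left + tile + halo)
            tiles.append((r0, r1, c0, c1))
    return tiles


def overlap_overhead(height: int, width: int, tile: int, kernel: int = 3) -> float:
    """Fraction of extra input pixels stored because adjacent sub-images
    share halo rows and columns (the overlapping overhead)."""
    total = sum((r1 - r0) * (c1 - c0)
                for r0, r1, c0, c1 in partition_with_halo(height, width, tile, kernel))
    return total / (height * width) - 1.0


if __name__ == "__main__":
    # CIFAR-10-sized 32 x 32 input with illustrative sub-image sizes.
    for tile in (8, 16, 32):
        n = len(partition_with_halo(32, 32, tile))
        print(f"tile {tile:2d} x {tile:<2d}: {n:2d} sub-images, "
              f"overlapping overhead = {overlap_overhead(32, 32, tile):.1%}")
```

Under these assumptions the halo pixels are what the description calls overlapping overhead; the paper's comparison of 3D versus 2D + 1D kernels then concerns how many unit crossbars the partitioned convolution occupies.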
format Article
id doaj.art-76d61d54357f4041bbe811ec2db707e9
institution Directory Open Access Journal
issn 2072-666X
language English
publishDate 2023-01-01
publisher MDPI AG
series Micromachines
doi 10.3390/mi14020309
source Micromachines, Vol. 14, Iss. 2, Article 309 (2023)
affiliation School of Electrical Engineering, Kookmin University, Seoul 02707, Republic of Korea (all authors)
title Area-Efficient Mapping of Convolutional Neural Networks to Memristor Crossbars Using Sub-Image Partitioning
topic area-efficient mapping
convolutional neural networks
memristor crossbars
sub-image partitioning
url https://www.mdpi.com/2072-666X/14/2/309