A Novel Deep-Learning Model Compression Based on Filter-Stripe Group Pruning and Its IoT Application
Nowadays, there is a tradeoff between the deep-learning model-compression ratio and the model accuracy. In this paper, a strategy for refining the pruning quantification and weights based on neural network filters is proposed. Firstly, filters in the neural network were refined into strip-like fil...
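The abstract's idea of refining whole filters into strip-like sub-filters can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a standard 4-D convolution weight tensor and treats each spatial kernel position as one "stripe", scoring stripes by L1 norm and zeroing the least important ones per filter (the function name and `keep_ratio` parameter are hypothetical):

```python
import numpy as np

def prune_filter_stripes(weights, keep_ratio=0.5):
    """Zero out the least-important stripes of each conv filter.

    weights: array of shape (out_channels, in_channels, kH, kW).
    Each filter is viewed as kH*kW stripes of shape (in_channels,);
    a stripe's importance is its L1 norm, and the lowest-scoring
    stripes within each filter are set to zero.
    """
    out_c, in_c, kh, kw = weights.shape
    pruned = weights.copy()
    # View each filter as kh*kw stripes across the input channels.
    stripes = pruned.reshape(out_c, in_c, kh * kw)
    # L1 importance of each stripe (summed over input channels).
    importance = np.abs(stripes).sum(axis=1)          # (out_c, kh*kw)
    n_keep = max(1, int(round(keep_ratio * kh * kw)))
    for f in range(out_c):
        order = np.argsort(importance[f])             # ascending importance
        drop = order[: kh * kw - n_keep]              # lowest-scoring stripes
        stripes[f, :, drop] = 0.0                     # writes through to pruned
    return pruned
```

Pruning at stripe granularity rather than whole-filter granularity is what lets such methods trade off compression ratio against accuracy more finely: a filter can lose its unimportant spatial positions while keeping the rest.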
Main Authors: Ming Zhao, Xindi Tong, Weixian Wu, Zhen Wang, Bingxue Zhou, Xiaodan Huang
Format: Article
Language: English
Published: MDPI AG, 2022-07-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/22/15/5623
Similar Items
- Magnitude and Similarity Based Variable Rate Filter Pruning for Efficient Convolution Neural Networks
  by: Deepak Ghimire, et al.
  Published: (2022-12-01)
- Smart guide : pruning /
  by: Smith, Miranda, 1944-
  Published: (2009)
- A Novel Channel Pruning Compression Algorithm Combined with an Attention Mechanism
  by: Ming Zhao, et al.
  Published: (2023-04-01)
- Weight pruning-UNet: Weight pruning UNet with depth-wise separable convolutions for semantic segmentation of kidney tumors
  by: Patike Kiran Rao, et al.
  Published: (2022-01-01)
- Compression of Deep Convolutional Neural Network Using Additional Importance-Weight-Based Filter Pruning Approach
  by: Shrutika S. Sawant, et al.
  Published: (2022-11-01)