Enhanced mechanisms of pooling and channel attention for deep learning feature maps
The pooling function is vital for deep neural networks (DNNs). It generalizes the representation of feature maps and progressively reduces their spatial size, lowering the network's computational cost. Furthermore, the function is also the basis for the compu...
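As a minimal sketch of the downsampling behavior described in the abstract (assuming a PyTorch-style setup; the tensor shapes and the 2×2 max pooling choice are illustrative assumptions, not details taken from the article), the following shows how a pooling layer halves the spatial size of a feature map and thereby shrinks the downstream computation:

```python
import torch
import torch.nn as nn

# Illustrative feature map: batch of 1, 64 channels, 32x32 spatial size
# (shapes are assumptions for demonstration only).
feature_map = torch.randn(1, 64, 32, 32)

# A standard 2x2 max pooling layer with stride 2 halves each spatial
# dimension, cutting the number of activations (and the computation
# fed to later layers) by a factor of 4.
pool = nn.MaxPool2d(kernel_size=2, stride=2)
pooled = pool(feature_map)

print(feature_map.shape)  # torch.Size([1, 64, 32, 32])
print(pooled.shape)       # torch.Size([1, 64, 16, 16])
```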
| Main Authors: | Hengyi Li, Xuebin Yue, Lin Meng |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | PeerJ Inc., 2022-11-01 |
| Series: | PeerJ Computer Science |
| Subjects: | |
| Online Access: | https://peerj.com/articles/cs-1161.pdf |
Similar Items
- RETRACTED: Deep Fractional Max Pooling Neural Network for COVID-19 Recognition
  by: Shui-Hua Wang, et al.
  Published: (2021-08-01)
- Image Denoising Using Adaptive and Overlapped Average Filtering and Mixed-Pooling Attention Refinement Networks
  by: Ming-Hao Lin, et al.
  Published: (2021-05-01)
- Mixed-pooling-dropout for convolutional neural network regularization
  by: Brahim Ait Skourt, et al.
  Published: (2022-09-01)
- A Feature Fusion Human Ear Recognition Method Based on Channel Features and Dynamic Convolution
  by: Xuebin Xu, et al.
  Published: (2023-07-01)
- Human action recognition using weighted pooling
  by: Wen Zhou, et al.
  Published: (2014-12-01)