Sequential Normalization: Embracing Smaller Sample Sizes for Normalization
Normalization layers have, over the years, proven effective for neural network optimization across a wide range of tasks, with batch normalization being one of the most successful approaches. The consensus is that better estimates of the BatchNo...
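Since the abstract is cut off, a minimal sketch of the batch-normalization computation it refers to may help: each feature is normalized using statistics estimated from the mini-batch, then rescaled by a learnable affine transform. The function name, shapes, and parameters below are illustrative, not taken from the paper.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch normalization over the batch axis (axis 0):
    normalize each feature with mini-batch statistics, then
    apply the learnable scale (gamma) and shift (beta)."""
    mu = x.mean(axis=0)        # per-feature batch mean
    var = x.var(axis=0)        # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 8))  # batch of 64, 8 features
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
print(np.allclose(y.mean(axis=0), 0.0, atol=1e-6))  # normalized mean ~ 0
print(np.allclose(y.std(axis=0), 1.0, atol=1e-3))   # normalized std ~ 1
```

The quality of `mu` and `var` as estimates of the true feature statistics depends on the batch size, which is the tension the article's title alludes to.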
Main Authors: Neofytos Dimitriou, Ognjen Arandjelović
Format: Article
Language: English
Published: MDPI AG, 2022-07-01
Series: Information
Online Access: https://www.mdpi.com/2078-2489/13/7/337
Similar Items
- Streaming Normalization: Towards Simpler and More Biologically-plausible Normalizations for Online and Recurrent Learning
  by: Liao, Qianli, et al.
  Published: (2016)
- Classification Model of Point Cloud Along Transmission Line Based on Group Normalization
  by: Zhimin Yin, et al.
  Published: (2022-05-01)
- EUNNet: Efficient UN-Normalized Convolution Layer for Stable Training of Deep Residual Networks Without Batch Normalization Layer
  by: Khanh-Binh Nguyen, et al.
  Published: (2023-01-01)
- Instance Segmentation Based on Improved Self-Adaptive Normalization
  by: Sen Yang, et al.
  Published: (2022-06-01)
- Adversarial Attacks and Batch Normalization: A Batch Statistics Perspective
  by: Awais Muhammad, et al.
  Published: (2023-01-01)