Flipover outperforms dropout in deep learning
Abstract: Flipover, an enhanced dropout technique, is introduced to improve the robustness of artificial neural networks. In contrast to dropout, which involves randomly removing certain neurons and their connections, flipover randomly selects neurons and reverts their outputs using a negative multip...
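The abstract's description suggests a drop-in replacement for a dropout layer. The sketch below illustrates one plausible reading of that description in PyTorch: during training, each activation is flipped by a negative multiplier (here -1) with probability p, rather than zeroed as in dropout. The class name `Flipover`, the default `p`, and the multiplier value are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class Flipover(nn.Module):
    """Sketch of a flipover layer: randomly selected activations have their
    sign reverted via a negative multiplier instead of being zeroed out.
    Hyperparameter names and defaults are assumptions for illustration."""

    def __init__(self, p: float = 0.1, multiplier: float = -1.0):
        super().__init__()
        if not 0.0 <= p < 1.0:
            raise ValueError("p must be in [0, 1)")
        self.p = p
        self.multiplier = multiplier  # negative factor applied to flipped units

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training or self.p == 0.0:
            return x  # identity at inference time, as with dropout
        # Select neurons to flip with probability p.
        flip_mask = torch.rand_like(x) < self.p
        # Flipped units are scaled by the negative multiplier; others pass through.
        return torch.where(flip_mask, x * self.multiplier, x)


# Usage: placed wherever nn.Dropout would normally appear.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    Flipover(p=0.1),
    nn.Linear(64, 10),
)
```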
Main Authors: Yuxuan Liang, Chuang Niu, Pingkun Yan, Ge Wang
Format: Article
Language: English
Published: SpringerOpen, 2024-02-01
Series: Visual Computing for Industry, Biomedicine, and Art
Online Access: https://doi.org/10.1186/s42492-024-00153-y
Similar Items
- β-Dropout: A Unified Dropout
  by: Lei Liu, et al.
  Published: (2019-01-01)
- A Review on Dropout Regularization Approaches for Deep Neural Networks within the Scholarly Domain
  by: Imrus Salehin, et al.
  Published: (2023-07-01)
- FocusedDropout for Convolutional Neural Network
  by: Minghui Liu, et al.
  Published: (2022-07-01)
- Integrating Dropout and Kullback-Leibler Regularization in Bayesian Neural Networks for improved uncertainty estimation in Regression
  by: Raghavendra M. Devadas, et al.
  Published: (2024-06-01)
- Scaleable input gradient regularization for adversarial robustness
  by: Chris Finlay, et al.
  Published: (2021-03-01)