AdamB: Decoupled Bayes by Backprop With Gaussian Scale Mixture Prior


Bibliographic Details
Main Authors: Keigo Nishida, Makoto Taiji
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9874837/
Description
Summary: Overfitting of neural networks to training data is one of the most significant problems in machine learning. Bayesian neural networks (BNNs) are known to be robust against overfitting owing to their ability to model parameter uncertainty. Bayes by Backprop (BBB), a simple variational inference approach that optimizes variational parameters by backpropagation, has been proposed to train BNNs. However, many studies have encountered challenges when applying variational inference to large-scale models such as deep neural networks. This study therefore proposed Adam with decoupled Bayes by Backprop (AdamB), which stabilizes the training of BNNs by applying the Adam estimator evaluation to the gradient of the neural network. The proposed approach stabilized the noisy gradient of BBB and mitigated excessive parameter changes. In addition, AdamB combined with a Gaussian scale mixture prior can suppress the intrinsic growth of the variational parameters. AdamB exhibited superior stability compared to training vanilla BBB with Adam. Furthermore, a covariate shift benchmark on image classification tasks showed that AdamB is more reliable than deep ensembles under noise-type covariate shifts. The considerations for stable learning of BNNs by AdamB demonstrated on image classification tasks are expected to provide important insights for applications in other domains.
ISSN: 2169-3536
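The summary above names three ingredients: Bayes by Backprop (a variational posterior trained via the reparameterization trick), a Gaussian scale mixture prior, and Adam applied to the resulting noisy gradients. The sketch below illustrates these ingredients on a one-weight toy regression. It is an illustrative assumption, not the paper's AdamB: it runs standard Adam on vanilla BBB gradients (the "decoupled" update is specific to the article), and all data, hyperparameters, and variable names are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise, so the posterior over the single weight w
# should concentrate near 2.  (Illustrative setup, not from the paper.)
x = np.linspace(-1.0, 1.0, 20)
y = 2.0 * x + 0.1 * rng.normal(size=x.shape)
lik_var = 0.1 ** 2                         # assumed observation-noise variance

# Gaussian scale mixture prior: p(w) = pi*N(0, s1^2) + (1-pi)*N(0, s2^2)
pi_mix, s1, s2 = 0.5, 1.0, 0.1

def gauss_pdf(w, s):
    return np.exp(-0.5 * (w / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def neg_log_prior_grad(w):
    # d/dw of -log p(w) for the scale mixture prior
    p1 = pi_mix * gauss_pdf(w, s1)
    p2 = (1.0 - pi_mix) * gauss_pdf(w, s2)
    return w * (p1 / s1 ** 2 + p2 / s2 ** 2) / (p1 + p2)

# Variational posterior q(w) = N(mu, sigma^2) with sigma = softplus(rho)
mu, rho = 0.0, -3.0
m_mu = v_mu = m_rho = v_rho = 0.0          # Adam moment estimates
lr, b1, b2, eps = 0.05, 0.9, 0.999, 1e-8

for t in range(1, 1001):
    sigma = np.log1p(np.exp(rho))
    epsilon = rng.normal()
    w = mu + sigma * epsilon               # reparameterization trick

    # Pathwise (BBB) gradients of f = log q(w) - log p(w) - log p(D|w)
    df_dw = (-(w - mu) / sigma ** 2                    # from log q
             + neg_log_prior_grad(w)                   # from -log p(w)
             + np.sum((w * x - y) * x) / lik_var)      # from -log p(D|w)
    df_dmu = (w - mu) / sigma ** 2
    df_dsigma = -1.0 / sigma + (w - mu) ** 2 / sigma ** 3
    dsig_drho = 1.0 / (1.0 + np.exp(-rho))  # softplus'(rho) = sigmoid(rho)

    g_mu = df_dw + df_dmu                   # dw/dmu = 1
    g_rho = (df_dw * epsilon + df_dsigma) * dsig_drho

    # Standard Adam on the variational parameters (vanilla BBB + Adam)
    m_mu = b1 * m_mu + (1 - b1) * g_mu
    v_mu = b2 * v_mu + (1 - b2) * g_mu ** 2
    m_rho = b1 * m_rho + (1 - b1) * g_rho
    v_rho = b2 * v_rho + (1 - b2) * g_rho ** 2
    mu -= lr * (m_mu / (1 - b1 ** t)) / (np.sqrt(v_mu / (1 - b2 ** t)) + eps)
    rho -= lr * (m_rho / (1 - b1 ** t)) / (np.sqrt(v_rho / (1 - b2 ** t)) + eps)

sigma = np.log1p(np.exp(rho))
print(f"posterior mean {mu:.3f}, std {sigma:.3f}")
```

Because the single-sample ELBO gradient is noisy, Adam's moment estimates are what keep the updates from oscillating wildly; the paper's observation is that this stabilization matters even more for BNNs, motivating its decoupled treatment of the neural-network gradient.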