Dropout inference in Bayesian neural networks with alpha-divergences
To obtain uncertainty estimates with real-world Bayesian deep learning models, practical inference approximations are needed. Dropout variational inference (VI), for example, has been used for machine vision and medical applications, but VI can severely underestimate model uncertainty. Alpha-divergen...
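The abstract refers to dropout-based approximate inference for uncertainty estimation. As background, the following is a minimal sketch of standard Monte Carlo dropout prediction (keeping dropout active at test time and averaging stochastic forward passes), not the paper's alpha-divergence objective; the network architecture, dropout rate, and sample count are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Illustrative regression network; the dropout layers are what make
# test-time Monte Carlo sampling meaningful.
model = nn.Sequential(
    nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.1),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.1),
    nn.Linear(64, 1),
)

def mc_dropout_predict(model, x, n_samples=50):
    """Run several stochastic forward passes with dropout left on.

    Returns the predictive mean and standard deviation across samples;
    the spread serves as the model-uncertainty estimate.
    """
    model.train()  # keeps dropout layers in sampling mode at test time
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)

x_test = torch.linspace(-3, 3, 100).unsqueeze(1)
mean, std = mc_dropout_predict(model, x_test)
```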
Main Authors: Li, Y; Gal, Y
Format: Conference item
Published: PMLR, 2017
Similar Items
- A theoretically grounded application of dropout in recurrent neural networks
  by: Gal, Y, et al.
  Published: (2016)
- Concrete dropout
  by: Gal, Y, et al.
  Published: (2018)
- Principles of Bayesian inference using general divergence criteria
  by: Jewson, J, et al.
  Published: (2018)
- Understanding approximation for Bayesian inference in neural networks
  by: Farquhar, S
  Published: (2022)
- Neural networks for inference, inference for neural networks
  by: Webb, S
  Published: (2018)