Expanding the capabilities of normalizing flows in deep generative models and variational inference
<p>Deep generative models and variational Bayesian inference are two frameworks for reasoning about observed high-dimensional data, and the two may be combined. A fundamental requirement of either approach is the parametrization of an expressive family of density models. Normalizing flows, some...</p>
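The abstract refers to normalizing flows as a way to parametrize expressive density models. As a minimal sketch of the underlying idea (not code from the thesis), the change-of-variables identity says that pushing a base density through an invertible map gives a tractable model density; the function names and the simple 1-D affine bijection x = mu + sigma * z below are illustrative assumptions:

```python
import math

def base_log_prob(z):
    """Log density of a standard normal base distribution."""
    return -0.5 * (z * z + math.log(2 * math.pi))

def affine_flow_log_prob(x, mu, sigma):
    """Log density of x under the pushforward of the base through x = mu + sigma * z.

    By the change-of-variables formula:
        log p_x(x) = log p_z(z) + log |dz/dx|,  with z = (x - mu) / sigma
    and here |dz/dx| = 1 / sigma.
    """
    z = (x - mu) / sigma              # inverse map z = f^{-1}(x)
    log_det = -math.log(sigma)        # log |dz/dx|
    return base_log_prob(z) + log_det

# Sanity check: for this affine flow the result must equal the
# N(mu, sigma^2) log density computed directly.
x, mu, sigma = 1.3, 0.5, 2.0
lp = affine_flow_log_prob(x, mu, sigma)
direct = (-0.5 * ((x - mu) / sigma) ** 2
          - math.log(sigma) - 0.5 * math.log(2 * math.pi))
print(abs(lp - direct) < 1e-12)  # True
```

Richer flows compose many such bijections (with learnable parameters), accumulating the log-determinant terms, which is what makes the family expressive while keeping exact likelihoods.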
Main Author: Caterini, AL
Other Authors: Doucet, A
Format: Thesis
Language: English
Published: 2021
Similar Items
- Fast and correct variational inference for probabilistic programming: Differentiability, reparameterisation and smoothing
  by: Wagner, D
  Published: (2023)
- Likelihood-free Bayesian inference for dynamic, stochastic simulators in the social sciences
  by: Dyer, J
  Published: (2022)
- Cross-scale generative adversarial network for crowd density estimation from images
  by: Zhang, Gaowei, et al.
  Published: (2022)
- Streaming Normalization: Towards Simpler and More Biologically-plausible Normalizations for Online and Recurrent Learning
  by: Liao, Qianli, et al.
  Published: (2016)
- Normal forms [film]
  Published: (1971)