Principles of Bayesian inference using general divergence criteria
When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data generating process, the decision maker (DM) must currently concern themselves with inference for the parameter value minimising the Kullback-Leibler (KL)-divergence between the model and...
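The abstract's starting point, that under misspecification standard inference targets the parameter minimising the KL divergence between the model and the data generating process, can be illustrated numerically. The sketch below (my own illustration, not code from the paper) draws data from a Student-t "true" process, fits a misspecified Gaussian by maximum likelihood, and checks that the estimates approach the Gaussian KL projection of the t distribution.

```python
import numpy as np

# Under misspecification, the Gaussian MLE is consistent for the
# parameters of the KL-minimising Gaussian, i.e. the KL projection
# of the true data generating process onto the Gaussian family.
rng = np.random.default_rng(0)
data = rng.standard_t(df=5, size=100_000)  # true DGP: Student-t, not Gaussian

# Gaussian MLE: sample mean and (biased) sample variance.
mu_hat = data.mean()
sigma2_hat = data.var()

# For a t distribution with df = 5, the KL-minimising Gaussian has
# mean 0 and variance df / (df - 2) = 5/3.
print(mu_hat, sigma2_hat)
```

With this many draws the estimates sit close to (0, 5/3), matching the KL projection rather than any "true" Gaussian parameter, which is exactly the target the abstract refers to.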
Main Authors: Jewson, J; Smith, J; Holmes, C
Format: Journal article
Published: MDPI, 2018
Similar Items
- Principles of Bayesian Inference Using General Divergence Criteria
  by: Jack Jewson, et al.
  Published: (2018-06-01)
- Dropout inference in Bayesian neural networks with alpha-divergences
  by: Li, Y, et al.
  Published: (2017)
- Phylogenetic inference under recombination using Bayesian stochastic topology selection
  by: Webb, A, et al.
  Published: (2009)
- Improvement and generalization of ABCD method with Bayesian inference
  by: Ezequiel Alvarez, Leandro Da Rold, Manuel Szewc, Alejandro Szynkman, Santiago A. Tanco, Tatiana Tarutina
  Published: (2024-07-01)
- Measuring and Controlling Bias for Some Bayesian Inferences and the Relation to Frequentist Criteria
  by: Michael Evans, et al.
  Published: (2021-02-01)