Approximated Information Analysis in Bayesian Inference
In models with nuisance parameters, Bayesian procedures based on Markov Chain Monte Carlo (MCMC) methods have been developed to approximate the posterior distribution of the parameter of interest. Because these procedures require burdensome computations related to the use of MCMC, approximation and convergence in these procedures are important issues. In this paper, we explore Gibbs sensitivity by using an alternative to the full conditional distribution of the nuisance parameter. The approximate sensitivity of the posterior distribution of interest is studied in terms of an information measure, including Kullback–Leibler divergence. As an illustration, we then apply these results to simple spatial model settings.
Main Authors: | Jung In Seo, Yongku Kim |
Format: | Article |
Language: | English |
Published: | MDPI AG, 2015-03-01 |
Series: | Entropy |
Subjects: | Bayesian sensitivity; Gibbs sampler; Kullback–Leibler divergence; Laplace approximation |
Online Access: | http://www.mdpi.com/1099-4300/17/3/1441 |
_version_ | 1811308135910998016 |
author | Jung In Seo; Yongku Kim |
author_sort | Jung In Seo |
collection | DOAJ |
description | In models with nuisance parameters, Bayesian procedures based on Markov Chain Monte Carlo (MCMC) methods have been developed to approximate the posterior distribution of the parameter of interest. Because these procedures require burdensome computations related to the use of MCMC, approximation and convergence in these procedures are important issues. In this paper, we explore Gibbs sensitivity by using an alternative to the full conditional distribution of the nuisance parameter. The approximate sensitivity of the posterior distribution of interest is studied in terms of an information measure, including Kullback–Leibler divergence. As an illustration, we then apply these results to simple spatial model settings. |
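To make the abstract's approach concrete, here is a minimal, illustrative sketch — not the authors' implementation. It runs a Gibbs sampler for a toy normal model twice: once with the exact Gamma full conditional for the nuisance precision, and once with a moment-matched normal substitute for that conditional (a Laplace-style approximation). The sensitivity of the resulting posterior of the parameter of interest is then summarized with a closed-form Kullback–Leibler divergence between Gaussian fits to the two sets of draws. The model, priors, toy data, the particular substitute conditional, and the Gaussian-fit KL summary are all assumptions made for illustration.

```python
# Illustrative sketch of Gibbs sensitivity under an approximated full
# conditional (assumptions only; not taken from the paper).
# Toy model: y_i ~ N(theta, 1/phi), theta ~ N(0, 1/tau0), phi ~ Gamma(a, b).
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.5, size=50)   # toy data (assumed)
n, ybar = y.size, y.mean()
a, b, tau0 = 2.0, 2.0, 1e-2                   # assumed hyperparameters

def gibbs(num_iter=5000, approx_phi=False):
    """Draw theta; optionally replace phi's exact full conditional."""
    theta, phi = 0.0, 1.0
    draws = np.empty(num_iter)
    for t in range(num_iter):
        # Exact full conditional: theta | phi, y is normal.
        prec = n * phi + tau0
        theta = rng.normal(n * phi * ybar / prec, prec ** -0.5)
        ss = np.sum((y - theta) ** 2)
        shape, rate = a + n / 2, b + ss / 2
        if approx_phi:
            # Alternative conditional: a moment-matched normal in place of
            # the exact Gamma, clipped to keep the precision positive.
            mu, sd = shape / rate, shape ** 0.5 / rate
            phi = max(rng.normal(mu, sd), 1e-8)
        else:
            # Exact full conditional: phi | theta, y is Gamma(shape, rate).
            phi = rng.gamma(shape, 1.0 / rate)   # numpy uses scale = 1/rate
        draws[t] = theta
    return draws[num_iter // 2:]                 # discard burn-in

def kl_normal(m1, s1, m2, s2):
    # KL( N(m1, s1^2) || N(m2, s2^2) ): a crude information measure based
    # on Gaussian summaries of the two marginal posteriors of theta.
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

exact = gibbs(approx_phi=False)
approx = gibbs(approx_phi=True)
kl = kl_normal(approx.mean(), approx.std(), exact.mean(), exact.std())
print(f"KL divergence, approximate vs. exact posterior of theta: {kl:.4g}")
```

For this conjugate toy problem the divergence is small, which is the point of the exercise: a KL-type measure quantifies how much replacing the nuisance parameter's full conditional perturbs the posterior of interest.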
first_indexed | 2024-04-13T09:17:37Z |
format | Article |
id | doaj.art-529a6473de9541a7b217b412401df166 |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
last_indexed | 2024-04-13T09:17:37Z |
publishDate | 2015-03-01 |
publisher | MDPI AG |
record_format | Article |
series | Entropy |
spelling | doaj.art-529a6473de9541a7b217b412401df166 (2022-12-22T02:52:42Z); eng; MDPI AG; Entropy; ISSN 1099-4300; published 2015-03-01; vol. 17, no. 3, pp. 1441–1451; doi:10.3390/e17031441 (e17031441). "Approximated Information Analysis in Bayesian Inference" by Jung In Seo (Department of Statistics, Yeungnam University, Gyeongsan 712-749, Korea) and Yongku Kim (Department of Statistics, Kyungpook National University, Daegu 702-701, Korea). Abstract, URL, and subject keywords as given above. |
title | Approximated Information Analysis in Bayesian Inference |
title_sort | approximated information analysis in bayesian inference |
topic | Bayesian sensitivity; Gibbs sampler; Kullback–Leibler divergence; Laplace approximation |
url | http://www.mdpi.com/1099-4300/17/3/1441 |
work_keys_str_mv | AT junginseo approximatedinformationanalysisinbayesianinference AT yongkukim approximatedinformationanalysisinbayesianinference |