The Prior Can Often Only Be Understood in the Context of the Likelihood
A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.
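The tension the abstract describes can be made concrete with a small numerical sketch (an illustration of the general point, not an example taken from the paper): the same "default" Normal(0, 1) prior on a logistic-regression slope is weakly informative when the covariate is on unit scale, but becomes strongly informative, pushing the prior predictive toward near-deterministic outcomes, when the covariate is rescaled. The prior's effect can only be judged against the likelihood it is paired with.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    """Numerically stable logistic function."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

# Draw slopes from the same "default" prior, beta ~ Normal(0, 1).
draws = [random.gauss(0.0, 1.0) for _ in range(10_000)]

def extreme_fraction(x):
    """Fraction of prior-predictive success probabilities sigmoid(beta * x)
    falling outside (0.01, 0.99), i.e. near-deterministic outcomes."""
    ps = [sigmoid(b * x) for b in draws]
    return sum(1 for p in ps if p < 0.01 or p > 0.99) / len(ps)

print(extreme_fraction(1.0))    # covariate on unit scale: extremes are rare
print(extreme_fraction(100.0))  # same prior, covariate on a 0-100 scale:
                                # most prior mass implies p near 0 or 1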
Main Authors: | Andrew Gelman, Daniel Simpson, Michael Betancourt |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2017-10-01 |
Series: | Entropy |
Subjects: | Bayesian inference; default priors; prior distribution |
Online Access: | https://www.mdpi.com/1099-4300/19/10/555 |
author | Andrew Gelman; Daniel Simpson; Michael Betancourt |
collection | DOAJ |
format | Article |
id | doaj.art-4882cf6466934e858d4cccf6b333c297 |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
publishDate | 2017-10-01 |
publisher | MDPI AG |
record_format | Article |
series | Entropy |
doi | 10.3390/e19100555 |
volume | 19 |
issue | 10 |
article number | 555 |
affiliations | Andrew Gelman: Department of Statistics, Columbia University, New York, NY 10027, USA; Daniel Simpson: Department of Statistical Sciences, University of Toronto, Toronto, ON M5S, Canada; Michael Betancourt: Institute for Social and Economic Research and Policy, Columbia University, New York, NY 10027, USA |
title | The Prior Can Often Only Be Understood in the Context of the Likelihood |
topic | Bayesian inference; default priors; prior distribution |
url | https://www.mdpi.com/1099-4300/19/10/555 |