The predictive view of Bayesian inference



Bibliographic Details
Main Author: Fong, CHE
Other Authors: Holmes, C
Format: Thesis
Language: English
Published: 2021
Summary:

This thesis considers the direct connection between the prediction of future observations and Bayesian inference. Using prediction as a guide, we generalize the Bayesian framework and introduce new methodologies for parameter inference and model selection which improve scalability and performance under model misspecification.

We begin with an introduction in Chapter 1 on methodologies for generalizing Bayesian inference, including the Bayesian bootstrap and general Bayesian updating. A summary of the current role of prediction in Bayesian inference is then provided.

In Chapter 2, we present a novel perspective on Bayesian inference which points to missing observations as the source of statistical uncertainty. We formally connect the Bayesian posterior on the parameter with the joint predictive distribution on the unobserved remainder of the population. Using this connection, we introduce the martingale posterior, which generalizes the Bayesian posterior. The martingale posterior only requires the elicitation of a predictive model, thus removing the need for the likelihood and prior. We discuss notions of predictive coherence and further introduce new nonparametric predictive models based on a bivariate copula update.

In Chapter 3, we investigate the computational benefits of Bayesian nonparametric learning using the Dirichlet process. This is closely related to the Bayesian bootstrap and is a special case of the martingale posterior. We find that the method is robust to model misspecification and is highly scalable due to its parallelizable nature. We further demonstrate that the posterior bootstrap, which approximately samples from the nonparametric posterior, is particularly proficient at targeting multimodal posteriors.

In Chapter 4, we introduce a notion of coherent model scoring under general Bayesian updating. We then explore the formal connection between the Bayesian marginal likelihood and cross-validation, and introduce a cumulative cross-validation score which alleviates some of the deficiencies of the marginal likelihood.

We provide a summary and discussion of future work in Chapter 5.
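To make the Bayesian bootstrap referenced in Chapters 1 and 3 concrete, the following is a minimal sketch (not code from the thesis): each posterior draw reweights the observed data with Dirichlet(1, ..., 1) weights and recomputes the statistic of interest, here the mean. All names and the toy dataset are illustrative assumptions, and the draws are independent, which is what makes the scheme embarrassingly parallel.

```python
import numpy as np

def bayesian_bootstrap_mean(x, n_draws=1000, rng=None):
    """Posterior samples of the mean under the Bayesian bootstrap:
    each draw uses Dirichlet(1, ..., 1) weights over the n observations."""
    rng = rng if rng is not None else np.random.default_rng()
    n = len(x)
    weights = rng.dirichlet(np.ones(n), size=n_draws)  # shape (n_draws, n)
    return weights @ np.asarray(x)  # one weighted mean per draw

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=200)  # toy observations
samples = bayesian_bootstrap_mean(data, n_draws=2000, rng=rng)
print(samples.mean(), samples.std())  # concentrates near the sample mean
```

Because no likelihood or prior for a parametric model is specified, uncertainty comes entirely from reweighting the observed sample, which is one route to the robustness under misspecification described above.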