Target-aware Bayesian inference: how to beat optimal conventional estimators
| Main authors: | Rainforth, T; Goliński, A; Wood, F; Zaidi, S |
|---|---|
| Format: | Journal article |
| Language: | English |
| Published: | Journal of Machine Learning Research, 2020 |
_version_ | 1826285099438571520 |
author | Rainforth, T; Goliński, A; Wood, F; Zaidi, S |
author_facet | Rainforth, T; Goliński, A; Wood, F; Zaidi, S |
author_sort | Rainforth, T |
collection | OXFORD |
description | Standard approaches for Bayesian inference focus solely on approximating the posterior distribution. Typically, this approximation is, in turn, used to calculate expectations for one or more target functions—a computational pipeline that is inefficient when the target function(s) are known upfront. We address this inefficiency by introducing a framework for target-aware Bayesian inference (TABI) that estimates these expectations directly. While conventional Monte Carlo estimators have a fundamental limit on the error they can achieve for a given sample size, our TABI framework is able to breach this limit; it can theoretically produce arbitrarily accurate estimators using only three samples, while we show empirically that it can also breach this limit in practice. We utilize our TABI framework by combining it with adaptive importance sampling approaches and show both theoretically and empirically that the resulting estimators are capable of converging faster than the standard O(1/N) Monte Carlo rate, potentially producing rates as fast as O(1/N²). We further combine our TABI framework with amortized inference methods to produce a method for amortizing the cost of calculating expectations. Finally, we show how TABI can be used to convert any marginal likelihood estimator into a target-aware inference scheme and demonstrate the substantial benefits this can yield. |
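One way to realize the "three samples" idea the abstract alludes to is to split the expectation into three separate importance-sampling problems — the positive part and negative part of the target integrand, plus the normalizing constant — so that each gets its own tailored proposal. The following is a minimal illustrative Python sketch of that split on a toy 1-D problem; the toy density, the Gaussian proposals, and all names here are our own illustrative choices, not taken from the paper:

```python
import numpy as np

def snis(gamma, f, q_sample, q_pdf, n, rng):
    # Conventional self-normalized importance sampling:
    # a single proposal q serves both the numerator and the normalizer.
    theta = q_sample(n, rng)
    w = gamma(theta) / q_pdf(theta)
    return np.sum(w * f(theta)) / np.sum(w)

def tabi(gamma, f, proposals, n, rng):
    # Target-aware split: E[f] = (Z_plus - Z_minus) / Z, where each of the
    # three integrals is estimated with its own importance sampler, so each
    # proposal can be matched to its own non-negative integrand.
    def z_hat(integrand, key):
        q_sample, q_pdf = proposals[key]
        theta = q_sample(n, rng)
        return np.mean(integrand(theta) / q_pdf(theta))
    z_plus = z_hat(lambda t: gamma(t) * np.maximum(f(t), 0.0), "plus")
    z_minus = z_hat(lambda t: gamma(t) * np.maximum(-f(t), 0.0), "minus")
    z_norm = z_hat(gamma, "norm")
    return (z_plus - z_minus) / z_norm

def gauss(mu, sigma):
    # A Gaussian proposal as a (sampler, density) pair.
    pdf = lambda t: np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    sample = lambda n, rng: rng.normal(mu, sigma, n)
    return sample, pdf

# Toy problem: unnormalized posterior proportional to N(1, 1) and target
# f(theta) = theta, so the true expectation is exactly 1.
gamma = lambda t: np.exp(-0.5 * (t - 1.0) ** 2)
f = lambda t: t

rng = np.random.default_rng(0)
proposals = {"plus": gauss(1.5, 1.2), "minus": gauss(-0.5, 1.2), "norm": gauss(1.0, 1.2)}
print(snis(gamma, f, *gauss(1.0, 1.2), 10_000, rng))  # both estimates should be close to 1
print(tabi(gamma, f, proposals, 10_000, rng))
```

The payoff of the split is that when a proposal is proportional to its (non-negative) integrand, the corresponding estimator has zero variance — which a single self-normalized proposal can never achieve for all three terms at once when f changes sign.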
first_indexed | 2024-03-07T01:23:47Z |
format | Journal article |
id | oxford-uuid:913eb733-760d-44f0-bf26-897023914d20 |
institution | University of Oxford |
language | English |
last_indexed | 2024-03-07T01:23:47Z |
publishDate | 2020 |
publisher | Journal of Machine Learning Research |
record_format | dspace |
spelling | oxford-uuid:913eb733-760d-44f0-bf26-897023914d20; 2022-03-26T23:17:32Z; Target-aware Bayesian inference: how to beat optimal conventional estimators; Journal article; http://purl.org/coar/resource_type/c_dcae04bc; uuid:913eb733-760d-44f0-bf26-897023914d20; English; Symplectic Elements; Journal of Machine Learning Research; 2020; Rainforth, T; Goliński, A; Wood, F; Zaidi, S |
spellingShingle | Rainforth, T; Goliński, A; Wood, F; Zaidi, S; Target-aware Bayesian inference: how to beat optimal conventional estimators |
title | Target-aware Bayesian inference: how to beat optimal conventional estimators |
title_full | Target-aware Bayesian inference: how to beat optimal conventional estimators |
title_fullStr | Target-aware Bayesian inference: how to beat optimal conventional estimators |
title_full_unstemmed | Target-aware Bayesian inference: how to beat optimal conventional estimators |
title_short | Target-aware Bayesian inference: how to beat optimal conventional estimators |
title_sort | target aware bayesian inference how to beat optimal conventional estimators |
work_keys_str_mv | AT rainfortht targetawarebayesianinferencehowtobeatoptimalconventionalestimators AT golinskia targetawarebayesianinferencehowtobeatoptimalconventionalestimators AT woodf targetawarebayesianinferencehowtobeatoptimalconventionalestimators AT zaidis targetawarebayesianinferencehowtobeatoptimalconventionalestimators |