On the computational complexity of MCMC-based estimators in large samples


Bibliographic Details
Main Authors: Belloni, Alexandre, Chernozhukov, Victor V.
Other Authors: Massachusetts Institute of Technology. Department of Economics
Format: Article
Language: en_US
Published: Institute of Mathematical Statistics 2013
Online Access: http://hdl.handle.net/1721.1/81193
https://orcid.org/0000-0002-3250-6714
Description: In this paper we examine the implications of statistical large sample theory for the computational complexity of Bayesian and quasi-Bayesian estimation carried out using Metropolis random walks. Our analysis is motivated by the Laplace–Bernstein–Von Mises central limit theorem, which states that in large samples the posterior or quasi-posterior approaches a normal density. Using the conditions required for the central limit theorem to hold, we establish polynomial bounds on the computational complexity of general Metropolis random walk methods in large samples. Our analysis covers cases where the underlying log-likelihood or extremum criterion function is possibly nonconcave, discontinuous, and of increasing parameter dimension. However, the central limit theorem restricts the deviations from continuity and log-concavity of the log-likelihood or extremum criterion function in a very specific manner. Under the minimal assumptions required for the central limit theorem to hold under increasing parameter dimension, we show that the Metropolis algorithm is theoretically efficient even for the canonical Gaussian walk, which is studied in detail. Specifically, we show that the running time of the algorithm in large samples is bounded in probability by a polynomial in the parameter dimension d and, in particular, is of stochastic order d² in the leading cases after the burn-in period. We then give applications to exponential families, curved exponential families, and Z-estimation of increasing dimension.
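The description above analyzes the canonical Gaussian random-walk Metropolis algorithm. As an illustrative sketch only (not the authors' code), the snippet below runs such a walk on a toy d-dimensional standard-normal posterior, mirroring the normal limiting shape that the Bernstein–Von Mises theorem predicts; the target, the step-size scaling, and the run length are all assumptions made for the example.

```python
import numpy as np

def random_walk_metropolis(log_post, theta0, n_steps, step_size, rng):
    """Canonical Gaussian random-walk Metropolis sampler.

    Proposes theta' = theta + step_size * N(0, I_d) and accepts with
    probability min(1, exp(log_post(theta') - log_post(theta))).
    """
    d = theta0.shape[0]
    chain = np.empty((n_steps, d))
    theta = theta0.copy()
    lp = log_post(theta)
    accepted = 0
    for t in range(n_steps):
        prop = theta + step_size * rng.standard_normal(d)
        lp_prop = log_post(prop)
        # Symmetric proposal, so the Metropolis ratio is just the
        # difference of log-posterior values.
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
            accepted += 1
        chain[t] = theta
    return chain, accepted / n_steps

# Toy target: a standard-normal "posterior" in d = 10 dimensions,
# standing in for the normal limit of the posterior in large samples.
d = 10
rng = np.random.default_rng(0)
log_post = lambda th: -0.5 * th @ th
# A step size on the order of c / sqrt(d) is the usual scaling for
# Gaussian random walks as the dimension grows.
chain, acc_rate = random_walk_metropolis(
    log_post, np.zeros(d), n_steps=20_000, step_size=2.4 / np.sqrt(d), rng=rng
)
```

After a burn-in, the retained draws behave like samples from the normal target; the paper's result is that the number of steps needed for this behavior is bounded by a polynomial in d (of order d² in the leading cases).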
Funding: Alfred P. Sloan Foundation (Research Fellowship); National Science Foundation (U.S.); IBM Research (IBM Herman Goldstein Fellowship); Massachusetts Institute of Technology, Department of Economics (Castle Krob Chair)
ISSN: 0090-5364
Citation: Belloni, Alexandre, and Victor Chernozhukov. "On the computational complexity of MCMC-based estimators in large samples." The Annals of Statistics 37, no. 4 (August 2009): 2011-2055. © 2009 Institute of Mathematical Statistics.
DOI: http://dx.doi.org/10.1214/08-aos634
Terms of use: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.