Bayesian time series models and scalable inference

Thesis: Ph.D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2014.

Bibliographic Details
Main Author: Johnson, Matthew James (Ph.D., Massachusetts Institute of Technology)
Other Authors: Alan S. Willsky (thesis supervisor)
Format: Thesis
Language: English
Published: Massachusetts Institute of Technology, 2014
Subjects: Electrical Engineering and Computer Science
Online Access: http://hdl.handle.net/1721.1/89993
Notes: Cataloged from PDF version of thesis. Includes bibliographical references (pages 197-206). 206 pages, application/pdf.
Rights: M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See http://dspace.mit.edu/handle/1721.1/7582 for inquiries about permission.

Abstract:

With large and growing datasets and complex models, there is an increasing need for scalable Bayesian inference. We describe two lines of work to address this need.

In the first part, we develop new algorithms for inference in hierarchical Bayesian time series models based on the hidden Markov model (HMM), the hidden semi-Markov model (HSMM), and their Bayesian nonparametric extensions. The HMM is ubiquitous in Bayesian time series models, and it and its Bayesian nonparametric extension, the hierarchical Dirichlet process hidden Markov model (HDP-HMM), have been applied in many settings. HSMMs and HDP-HSMMs extend these dynamical models to provide state-specific duration modeling, but at the cost of increased computational complexity for inference, limiting their general applicability. A challenge with all such models is scaling inference to large datasets. We address these challenges in several ways. First, we develop classes of duration models for which HSMM message passing complexity scales only linearly in the observation sequence length. Second, we apply the stochastic variational inference (SVI) framework to develop scalable inference for the HMM, the HSMM, and their nonparametric extensions. Third, we build on these ideas to define a new Bayesian nonparametric model that can capture dynamics at multiple timescales while still allowing efficient and scalable inference.

In the second part of this thesis, we develop a theoretical framework to analyze a special case of a highly parallelizable sampling strategy we refer to as Hogwild Gibbs sampling. Thorough empirical work has shown that Hogwild Gibbs sampling works very well for inference in large latent Dirichlet allocation (LDA) models, but there is little theory to understand when it may be effective in general. By studying Hogwild Gibbs applied to sampling from Gaussian distributions, we develop analytical results as well as a deeper understanding of its behavior, including its convergence and correctness in some regimes.
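The first part of the abstract centers on message passing in HMM-family models and on making it scale. For reference, the sketch below is the standard O(TK^2) HMM forward recursion that such models build on; it is an illustrative NumPy sketch, not the HSMM or SVI algorithms developed in the thesis.

```python
import numpy as np

def hmm_forward_messages(pi, A, likelihoods):
    """Standard scaled HMM forward (alpha) recursion.

    pi          : (K,) initial state distribution
    A           : (K, K) transition matrix, A[i, j] = p(z_{t+1} = j | z_t = i)
    likelihoods : (T, K) observation likelihoods p(y_t | z_t = k)

    Returns the normalized forward messages (T, K) and log p(y_{1:T}),
    accumulated from the per-step normalizers. Cost is O(T K^2).
    """
    T, K = likelihoods.shape
    alphas = np.empty((T, K))
    log_Z = 0.0

    alpha = pi * likelihoods[0]
    for t in range(T):
        norm = alpha.sum()
        alphas[t] = alpha / norm      # normalize to avoid underflow
        log_Z += np.log(norm)
        if t + 1 < T:
            alpha = (alphas[t] @ A) * likelihoods[t + 1]
    return alphas, log_Z
```

The thesis's first set of contributions targets the HSMM analogue of this recursion, whose cost for general duration models grows quadratically in the sequence length, and develops duration model classes for which it stays linear.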
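For the second part, the following is a minimal illustrative sketch of a block-parallel ("Hogwild") Gibbs scheme applied to a Gaussian in information form. The function name, the partition into blocks, the synchronization schedule, and the serial simulation of the parallel processors are assumptions made here for illustration; the thesis's construction and its analysis of when such schemes converge, and to what, are not reproduced.

```python
import numpy as np

def hogwild_gibbs_gaussian(J, h, blocks, n_outer=500, n_inner=1, rng=None):
    """Block-parallel ("Hogwild") Gibbs sketch for a Gaussian in information form,
    p(x) proportional to exp(-x'Jx/2 + h'x), whose exact mean is solve(J, h).

    J, h    : precision matrix (n, n) and potential vector (n,)
    blocks  : list of index arrays partitioning {0, ..., n-1}
    n_outer : number of global synchronization rounds
    n_inner : local single-site Gibbs sweeps each block runs per round,
              reading stale values of the other blocks (simulated serially)

    Returns the state after each synchronization round, shape (n_outer, n).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.zeros(len(h))
    samples = []

    for _ in range(n_outer):
        x_stale = x.copy()                  # snapshot every block reads this round
        x_new = x_stale.copy()
        for block in blocks:                # "parallel" processors, run serially here
            local = x_stale.copy()
            for _ in range(n_inner):
                for i in block:             # exact single-site Gibbs within the block
                    cond_prec = J[i, i]
                    cond_mean = (h[i] - J[i] @ local + J[i, i] * local[i]) / cond_prec
                    local[i] = cond_mean + rng.standard_normal() / np.sqrt(cond_prec)
            x_new[block] = local[block]     # publish only this block's coordinates
        x = x_new                           # global synchronization
        samples.append(x)
    return np.array(samples)


# Toy usage: a diagonally dominant precision matrix split into two blocks.
rng = np.random.default_rng(1)
n = 6
B = rng.standard_normal((n, n))
J = B @ B.T + n * np.eye(n)
h = rng.standard_normal(n)
samples = hogwild_gibbs_gaussian(J, h, [np.arange(3), np.arange(3, 6)], rng=rng)
print(samples[100:].mean(axis=0))           # compare against np.linalg.solve(J, h)
```

A diagonally dominant precision matrix, as in the toy usage, is a regime where block-parallel updates of this sort are typically stable; the thesis characterizes convergence and correctness of Hogwild Gibbs on Gaussians more generally.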