Efficient algorithms for learning mixture models
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2016.
Main Author: | Huang, Qingqing |
---|---|
Other Authors: | Munther A. Dahleh (advisor) |
Format: | Thesis |
Language: | eng |
Published: | Massachusetts Institute of Technology, 2017 |
Subjects: | Electrical Engineering and Computer Science |
Online Access: | http://hdl.handle.net/1721.1/107337 |
author | Huang, Qingqing, Ph. D. Massachusetts Institute of Technology |
author2 | Munther A. Dahleh. |
collection | MIT |
description | Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2016. |
format | Thesis |
id | mit-1721.1/107337 |
institution | Massachusetts Institute of Technology |
language | eng |
publishDate | 2017 |
publisher | Massachusetts Institute of Technology |
spelling | Efficient algorithms for learning mixture models. Huang, Qingqing. Advisor: Munther A. Dahleh. Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2016. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 261-274).

We study statistical learning problems for a class of probabilistic models called mixture models. Mixture models are used in settings where the observed data consist of different sub-populations, yet we only have access to a limited number of samples of the pooled data. The class includes many widely used models, such as Gaussian mixture models, Hidden Markov Models (HMMs), and topic models. We focus on parametric learning: given unlabeled data generated according to a mixture model, infer the parameters of the underlying model. The hierarchical structure of the probabilistic model makes the likelihood function non-convex in the model parameters, posing great challenges to finding statistically efficient and computationally efficient solutions.

In the first part, we start with a simple yet general mixture-model setup. We study the problem of estimating a low-rank M x M matrix that represents a discrete distribution over M^2 outcomes, given access to samples drawn according to the distribution. We propose a learning algorithm that accurately recovers the underlying matrix using Õ(M) samples, which immediately leads to improved learning algorithms for various mixture models, including topic models and HMMs. We show that this linear sample complexity is optimal in the minimax sense.

There are "hard" mixture models for which worst-case sample-complexity lower bounds scale exponentially in the model dimensions. In the second part, we study Gaussian mixture models and HMMs, and propose new learning algorithms with polynomial runtime. We leverage techniques from probabilistic analysis to prove that worst-case instances are actually rare, and that our algorithms can efficiently handle all non-worst-case instances.

In the third part, we study the problem of super-resolution. Despite the lower bound for any deterministic algorithm, we propose a new randomized algorithm whose complexity scales only quadratically in all dimensions, and we show that it can handle any instance with high probability over the randomization.

by Qingqing Huang. Ph. D. 2016. MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source, but further reproduction or distribution in any format is prohibited without written permission. http://dspace.mit.edu/handle/1721.1/7582 274 pages, application/pdf. Massachusetts Institute of Technology |
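The abstract's first result concerns recovering a low-rank probability matrix from samples of outcome pairs. As a minimal illustration of the underlying idea only (not the algorithm from the thesis), the sketch below draws samples from a rank-2 ground-truth matrix, forms the empirical count matrix, and truncates it to rank K with an SVD; all dimensions, sample sizes, and the ground-truth construction here are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical rank-2 ground truth: B = sum_k w_k * u_k v_k^T,
# nonnegative and summing to 1, so it is a distribution over M^2 outcomes.
M, K = 20, 2
u = rng.dirichlet(np.ones(M), size=K)  # K rows, each a distribution over M
v = rng.dirichlet(np.ones(M), size=K)
w = np.array([0.6, 0.4])               # mixture weights, summing to 1
B = sum(w[k] * np.outer(u[k], v[k]) for k in range(K))

# Draw N i.i.d. outcome pairs (i, j) and form the empirical matrix.
N = 50_000
flat = rng.choice(M * M, size=N, p=B.ravel())
counts = np.bincount(flat, minlength=M * M).reshape(M, M)
B_hat = counts / N

# Keep only the top-K singular directions as a rank-K denoised estimate.
U, s, Vt = np.linalg.svd(B_hat)
B_lowrank = (U[:, :K] * s[:K]) @ Vt[:K]

# Compare entrywise (L1) estimation error of the two estimates.
err_raw = np.abs(B_hat - B).sum()
err_lowrank = np.abs(B_lowrank - B).sum()
print(err_raw, err_lowrank)
```

Truncating to the true rank typically discards most of the sampling noise, which spreads over all M singular directions, while the signal is concentrated in the top K; the thesis's contribution is achieving such recovery guarantees with only Õ(M) samples rather than the M^2 needed to estimate every entry.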
title | Efficient algorithms for learning mixture models |
topic | Electrical Engineering and Computer Science. |
url | http://hdl.handle.net/1721.1/107337 |