On Convergence Properties of the EM Algorithm for Gaussian Mixtures


Bibliographic Details
Main Authors: Jordan, Michael; Xu, Lei
Language: English (en_US)
Published: 2004
Online Access: http://hdl.handle.net/1721.1/7195
Description: We build up the mathematical connection between the "Expectation-Maximization" (EM) algorithm and gradient-based approaches for maximum likelihood learning of finite Gaussian mixtures. We show that the EM step in parameter space is obtained from the gradient via a projection matrix $P$, and we provide an explicit expression for the matrix. We then analyze the convergence of EM in terms of special properties of $P$ and provide new results analyzing the effect that $P$ has on the likelihood surface. Based on these mathematical results, we present a comparative discussion of the advantages and disadvantages of EM and other algorithms for the learning of Gaussian mixture models.
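To make the iteration the abstract analyzes concrete, here is a minimal sketch (not code from the memo) of EM for a two-component, one-dimensional Gaussian mixture. Each pass performs the E-step (posterior responsibilities) and M-step (closed-form parameter updates); it is this combined update that the paper shows equals the projection matrix $P$ applied to the likelihood gradient. All names and the synthetic data are illustrative.

```python
import math
import random

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def log_likelihood(xs, pi, mu, var):
    """Mixture log likelihood sum_i log sum_k pi_k N(x_i; mu_k, var_k)."""
    return sum(
        math.log(sum(p * normal_pdf(x, m, v) for p, m, v in zip(pi, mu, var)))
        for x in xs
    )

def em_step(xs, pi, mu, var):
    K, n = len(pi), len(xs)
    # E-step: responsibilities h[i][k] = P(component k | x_i, current params).
    H = []
    for x in xs:
        dens = [pi[k] * normal_pdf(x, mu[k], var[k]) for k in range(K)]
        s = sum(dens)
        H.append([d / s for d in dens])
    # M-step: maximize the expected complete-data log likelihood in closed form.
    nk = [sum(H[i][k] for i in range(n)) for k in range(K)]
    pi_new = [nk[k] / n for k in range(K)]
    mu_new = [sum(H[i][k] * xs[i] for i in range(n)) / nk[k] for k in range(K)]
    var_new = [
        sum(H[i][k] * (xs[i] - mu_new[k]) ** 2 for i in range(n)) / nk[k]
        for k in range(K)
    ]
    return pi_new, mu_new, var_new

# Synthetic data: two well-separated clusters (illustrative, not from the paper).
random.seed(0)
xs = [random.gauss(-2, 1) for _ in range(200)] + [random.gauss(3, 1) for _ in range(200)]

pi, mu, var = [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0]
for _ in range(30):
    before = log_likelihood(xs, pi, mu, var)
    pi, mu, var = em_step(xs, pi, mu, var)
    # A special property of the EM update: the likelihood never decreases.
    assert log_likelihood(xs, pi, mu, var) >= before - 1e-9
```

The monotonicity asserted in the loop is the classical EM ascent property; the memo's contribution is the finer-grained analysis, via $P$, of *how fast* this ascent proceeds relative to plain gradient methods.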
Record ID: mit-1721.1/7195
Institution: Massachusetts Institute of Technology
Report Numbers: AIM-1520; CBCL-111
Date Issued: 1995-04-21
Date Available: 2004-10-20
Extent: 9 p.
File Formats: application/postscript; application/pdf
Topics: learning; neural networks; EM algorithm; clustering; mixture models; statistics