A new iterative initialization of EM algorithm for Gaussian mixture models.


Bibliographic Details
Main Authors: Jie You, Zhaoxuan Li, Junli Du
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2023-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0284114
Description
Summary:<h4>Background</h4>The expectation maximization (EM) algorithm is a common tool for estimating the parameters of Gaussian mixture models (GMM). However, it is highly sensitive to initial values and easily gets trapped in a local optimum.<h4>Method</h4>To address these problems, a new iterative method of EM initialization (MRIPEM) is proposed in this paper. It incorporates the ideas of multiple restarts, iteration, and clustering. In particular, the mean vector and covariance matrix of the sample are calculated as the initial values of the iteration. Then, the optimal feature vector is selected from the candidate feature vectors by the maximum Mahalanobis distance as a new partition vector for clustering. The parameter values are renewed continuously according to the clustering results.<h4>Results</h4>To verify the applicability of MRIPEM, we compared it with two other popular initialization methods on both simulated and real datasets. The comparison results of the three stochastic algorithms indicate that the MRIPEM algorithm is comparable in relatively high dimensions and with high overlap, and significantly better in low dimensions and with low overlap.
ISSN: 1932-6203
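To make the abstract's idea concrete, here is a minimal sketch of EM for a GMM whose initial means are chosen by maximum Mahalanobis distance from the sample mean and covariance. This is an illustrative reading of the approach described above, not the authors' exact MRIPEM procedure (which also involves multiple restarts and iterative refinement of the partition vector); the function names `mahalanobis_init` and `em_gmm` and all numerical details here are assumptions for illustration.

```python
import numpy as np

def mahalanobis_init(X, k):
    """Sketch: pick k initial centers using Mahalanobis distances computed
    from the overall sample mean and covariance (illustrative, not MRIPEM)."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
    inv = np.linalg.inv(cov)
    # squared Mahalanobis distance of every sample to the global mean
    d = np.einsum('ij,jk,ik->i', X - mu, inv, X - mu)
    centers = [X[np.argmax(d)]]  # first center: farthest sample from the mean
    for _ in range(k - 1):
        # next center: sample maximizing its minimum Mahalanobis
        # distance to the centers already chosen
        dists = np.min(
            [np.einsum('ij,jk,ik->i', X - c, inv, X - c) for c in centers],
            axis=0)
        centers.append(X[np.argmax(dists)])
    return np.array(centers)

def em_gmm(X, k, n_iter=50):
    """Standard EM for a Gaussian mixture, seeded by mahalanobis_init."""
    n, p = X.shape
    means = mahalanobis_init(X, k)
    covs = np.array([np.cov(X, rowvar=False) for _ in range(k)])
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each sample
        resp = np.zeros((n, k))
        for j in range(k):
            diff = X - means[j]
            inv = np.linalg.inv(covs[j])
            logdet = np.linalg.slogdet(covs[j])[1]
            logpdf = -0.5 * (np.einsum('ij,jk,ik->i', diff, inv, diff)
                             + logdet + p * np.log(2 * np.pi))
            resp[:, j] = weights[j] * np.exp(logpdf)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, covariances
        nk = resp.sum(axis=0)
        weights = nk / n
        means = (resp.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - means[j]
            covs[j] = ((resp[:, j, None] * diff).T @ diff / nk[j]
                       + 1e-6 * np.eye(p))
    return weights, means, covs
```

On well-separated data this initialization tends to place one starting center per cluster, so EM converges without the restarts that a random initialization would need; the paper's contribution is precisely a more robust version of this seeding step.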