A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables.
The use of mutual information as a similarity measure in agglomerative hierarchical clustering (AHC) raises an important issue: some correction needs to be applied for the dimensionality of variables. In this work, we formulate the decision of merging dependent multivariate normal variables in an AHC procedure as a Bayesian model comparison. We found that the Bayesian formulation naturally shrinks the empirical covariance matrix towards a matrix set a priori (e.g., the identity), provides an automated stopping rule, and corrects for dimensionality using a term that scales up the measure as a function of the dimensionality of the variables. Also, the resulting log Bayes factor is asymptotically proportional to the plug-in estimate of mutual information, with an additive correction for dimensionality in agreement with the Bayesian information criterion. We investigated the behavior of these Bayesian alternatives (in exact and asymptotic forms) to mutual information on simulated and real data. An encouraging result was first derived on simulations: the hierarchical clustering based on the log Bayes factor outperformed off-the-shelf clustering techniques as well as raw and normalized mutual information in terms of classification accuracy. On a toy example, we found that the Bayesian approaches led to results that were similar to those of mutual information clustering techniques, with the advantage of an automated thresholding. On real functional magnetic resonance imaging (fMRI) datasets measuring brain activity, the Bayesian approach identified clusters consistent with the established outcome of standard procedures. On this application, normalized mutual information had a highly atypical behavior, in the sense that it systematically favored very large clusters. These initial experiments suggest that the proposed Bayesian alternatives to mutual information are a useful new tool for hierarchical clustering.
Main Authors: | Guillaume Marrelec, Arnaud Messé, Pierre Bellec |
---|---|
Format: | Article |
Language: | English |
Published: | Public Library of Science (PLoS), 2015-01-01 |
Series: | PLoS ONE, vol. 10, no. 9, e0137278 |
ISSN: | 1932-6203 |
DOI: | 10.1371/journal.pone.0137278 |
Collection: | DOAJ (Directory of Open Access Journals) |
Online Access: | http://europepmc.org/articles/PMC4583305?pdf=render |
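The abstract describes a log Bayes factor that is asymptotically proportional to the Gaussian plug-in estimate of mutual information between two candidate groups of variables, with an additive BIC-style correction for dimensionality. The sketch below illustrates that idea under an assumed approximation, score ≈ N·Î(X;Y) − (pq/2)·log N, where p and q are the group dimensions and pq counts the cross-covariance parameters of the joint model; it is not the paper's exact derivation (the shrinkage prior and exact constants are omitted), and all function names are illustrative.

```python
# Minimal sketch (assumption, not the paper's implementation): score a candidate merge of
# two variable groups X (N, p) and Y (N, q) in agglomerative hierarchical clustering with a
# BIC-style stand-in for the log Bayes factor of "merge" vs "keep separate":
#   score ~= N * I_hat(X; Y) - (p * q / 2) * log(N)
# where I_hat is the Gaussian plug-in mutual information and p*q counts the extra
# cross-covariance parameters of the joint Gaussian model.

import numpy as np

def gaussian_plugin_mi(data_x: np.ndarray, data_y: np.ndarray) -> float:
    """Plug-in mutual information (in nats) for jointly Gaussian X (N, p) and Y (N, q)."""
    joint = np.hstack([data_x, data_y])
    s_joint = np.cov(joint, rowvar=False)
    p = data_x.shape[1]
    s_x = s_joint[:p, :p]
    s_y = s_joint[p:, p:]
    # I(X; Y) = 0.5 * log( det(S_X) * det(S_Y) / det(S_XY) )
    return 0.5 * (np.linalg.slogdet(s_x)[1]
                  + np.linalg.slogdet(s_y)[1]
                  - np.linalg.slogdet(s_joint)[1])

def bic_merge_score(data_x: np.ndarray, data_y: np.ndarray) -> float:
    """BIC-style asymptotic approximation to the log Bayes factor for merging X and Y."""
    n = data_x.shape[0]
    p, q = data_x.shape[1], data_y.shape[1]
    return n * gaussian_plugin_mi(data_x, data_y) - 0.5 * p * q * np.log(n)

# Toy usage: a dependent pair should score positive, an independent pair negative.
rng = np.random.default_rng(0)
x = rng.normal(size=(500, 3))
y = x[:, :2] + 0.5 * rng.normal(size=(500, 2))   # strongly dependent on x
z = rng.normal(size=(500, 2))                    # independent of x
print(bic_merge_score(x, y) > 0, bic_merge_score(x, z) < 0)
```

In an AHC loop built on this score, the pair with the largest positive value would be merged at each step, and clustering would stop once no pair scores above zero, which mirrors the automated stopping rule mentioned in the abstract.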