Beta Distribution-Based Cross-Entropy for Feature Selection

Analysis of high-dimensional data is a challenge in machine learning and data mining. Feature selection plays an important role in dealing with high-dimensional data, both for improving predictive accuracy and for better interpretation of the data. Frequently used evaluation functions for feature selection include resampling methods such as cross-validation, which offer an advantage in predictive accuracy; however, these conventional methods are not only computationally expensive but also tend to be over-optimistic. We propose a novel cross-entropy for feature selection based on the beta distribution. In beta distribution-based cross-entropy (BetaDCE), the probability density is estimated by the beta distribution and the cross-entropy is computed as its expected value, so that the generalization ability can be estimated more precisely than with conventional methods, where the probability density is learned from the data. Analysis of the generalization ability of BetaDCE revealed a trade-off between bias and variance. The robustness of BetaDCE was demonstrated by experiments on three types of data. On the exclusive-or-like (XOR-like) dataset, the false discovery rate of BetaDCE was significantly smaller than that of other methods. On the leukemia dataset, the area under the curve (AUC) of BetaDCE on the test set was 0.93 with only four selected features, indicating that BetaDCE not only detected the irrelevant and redundant features precisely, but also predicted the class labels more accurately with fewer features than the original method, whose AUC was 0.83 with 50 features. On the metabonomic dataset, the overall AUC of prediction with features selected by BetaDCE was significantly larger than with the originally reported method. BetaDCE can therefore be used as a general and efficient framework for feature selection.

Bibliographic Details
Main Authors: Weixing Dai, Dianjing Guo
Author Affiliation (both authors): School of Life Science and State Key Laboratory of Agrobiotechnology, G94, Science Center South Block, The Chinese University of Hong Kong, Shatin 999077, Hong Kong, China
Format: Article
Language: English
Published: MDPI AG, 2019-08-01
Series: Entropy, Vol. 21, Iss. 8, Art. 769
ISSN: 1099-4300
DOI: 10.3390/e21080769
Collection: DOAJ (Directory of Open Access Journals)
Subjects: cross-entropy; beta distribution; feature selection; machine learning; data mining
Online Access: https://www.mdpi.com/1099-4300/21/8/769
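To make the abstract's idea concrete, the sketch below illustrates one way a beta-distribution-based cross-entropy score could drive feature selection: samples are grouped into cells by their (discrete) values on a candidate feature subset, each cell's class probability gets a Beta posterior, and the score is the closed-form expected cross-entropy under that Beta rather than a plug-in estimate. This is an illustrative interpretation only — the cell partitioning, the uniform Beta(1, 1) prior, and the greedy forward search are assumptions not taken from the paper, whose exact BetaDCE formulation may differ.

```python
from math import log
from collections import Counter


def digamma(x):
    """Digamma psi(x) via upward recurrence plus an asymptotic expansion."""
    result = 0.0
    while x < 6.0:              # shift upward: psi(x) = psi(x + 1) - 1/x
        result -= 1.0 / x
        x += 1.0
    result += log(x) - 1.0 / (2.0 * x)
    inv2 = 1.0 / (x * x)
    # leading terms of the asymptotic series for psi(x)
    result -= inv2 * (1.0 / 12.0 - inv2 * (1.0 / 120.0 - inv2 / 252.0))
    return result


def beta_dce(cells):
    """Expected cross-entropy over data cells (lower is better).

    Each cell is (n, k): n samples, k of them in the positive class.
    With a Beta(k + 1, n - k + 1) posterior for the cell's class
    probability p (uniform-prior assumption), E[-ln p] = psi(a + b) -
    psi(a), so the cross-entropy is an exact expectation under the Beta
    rather than -ln of a point estimate of p.
    """
    total = 0.0
    for n, k in cells:
        a, b = k + 1.0, n - k + 1.0
        total += k * (digamma(a + b) - digamma(a))        # positive labels
        total += (n - k) * (digamma(a + b) - digamma(b))  # negative labels
    return total


def partition(X, y, features):
    """Group samples into cells by their values on the chosen features."""
    counts, pos = Counter(), Counter()
    for row, label in zip(X, y):
        key = tuple(row[f] for f in features)
        counts[key] += 1
        pos[key] += label
    return [(counts[key], pos[key]) for key in counts]


def greedy_select(X, y, n_features, max_select):
    """Greedy forward selection minimizing the expected cross-entropy."""
    selected = []
    for _ in range(max_select):
        best = min((f for f in range(n_features) if f not in selected),
                   key=lambda f: beta_dce(partition(X, y, selected + [f])))
        selected.append(best)
    return selected
```

On an XOR-like toy set, neither feature alone separates the classes, so `beta_dce(partition(X, y, [0, 1]))` is sharply lower than the score of either single feature — consistent with the abstract's point that the method handles XOR-like structure; note, though, that a purely greedy search can still miss such interactions, and the paper's actual search strategy is not specified here.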