A Mutual Information estimator for continuous and discrete variables applied to Feature Selection and Classification problems
Mutual Information is currently widely used in pattern recognition and feature selection problems. It can serve as a measure of redundancy between features as well as a measure of dependency for evaluating the relevance of each feature. Since marginal densities of real datasets are not usually k...
| Main Authors: | Frederico Coelho, Antonio P. Braga, Michel Verleysen |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Springer, 2016-08-01 |
| Series: | International Journal of Computational Intelligence Systems |
| Online Access: | https://www.atlantis-press.com/article/25868723/view |
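
The abstract above refers to mutual information as a measure of feature relevance and redundancy when the marginal densities of a dataset are unknown. As a rough, illustrative sketch only (not the estimator proposed in the article), the snippet below ranks features by their mutual information with the class label using scikit-learn's `mutual_info_classif`, whose nearest-neighbour based estimate avoids explicit density estimation; the dataset and parameter values are arbitrary choices for demonstration.

```python
# Illustrative sketch (not the authors' estimator): rank continuous features by
# their mutual information with a discrete class label.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif

# Example dataset chosen only for demonstration purposes.
X, y = load_breast_cancer(return_X_y=True)

# Nearest-neighbour based MI estimate between each feature and the class label;
# no explicit density estimation of the marginals is required.
mi = mutual_info_classif(X, y, n_neighbors=3, random_state=0)

# Higher MI means the feature is more informative about (relevant to) the class.
ranking = np.argsort(mi)[::-1]
print("Top 5 features by mutual information:", ranking[:5])
```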
Similar Items
- Multilabel Feature Selection Using Mutual Information and ML-ReliefF for Multilabel Classification
  by: Enhui Shi, et al.
  Published: (2020-01-01)
- Feature Selection with Conditional Mutual Information Considering Feature Interaction
  by: Jun Liang, et al.
  Published: (2019-07-01)
- A Filter Feature Selection Algorithm Based on Mutual Information for Intrusion Detection
  by: Fei Zhao, et al.
  Published: (2018-09-01)
- Feature selection based on fuzzy joint mutual information maximization
  by: Omar A. M. Salem, et al.
  Published: (2021-04-01)
- Clusters of Features Using Complementary Information Applied to Gender Classification From Face Images
  by: Juan E. Tapia, et al.
  Published: (2019-01-01)