Fuzzy Mutual Information Based min-Redundancy and Max-Relevance Heterogeneous Feature Selection


Bibliographic Details
Main Authors: Daren Yu, Shuang An, Qinghua Hu
Format: Article
Language: English
Published: Springer 2011-08-01
Series: International Journal of Computational Intelligence Systems
Online Access: https://www.atlantis-press.com/article/2353.pdf
Description
Summary: Feature selection is an important preprocessing step in pattern classification and machine learning, and mutual information is widely used to measure the relevance between features and the decision. However, it is difficult to directly calculate the relevance between continuous or fuzzy features using mutual information. In this paper we introduce fuzzy information entropy and fuzzy mutual information for computing the relevance between numerical or fuzzy features and the decision. The relationship between fuzzy information entropy and differential entropy is also discussed. Moreover, we combine fuzzy mutual information with the "min-Redundancy-Max-Relevance", "Max-Dependency" and "min-Redundancy-Max-Dependency" algorithms. The performance and stability of the proposed algorithms are tested on benchmark data sets. Experimental results show that the proposed algorithms are effective and stable.
ISSN: 1875-6883
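The record describes the approach only at a high level, so the following is a minimal sketch of how a fuzzy-mutual-information-based min-Redundancy-Max-Relevance selector could look. The Gaussian similarity relation, the relation-based fuzzy entropy, and all function names (`fuzzy_relation`, `mrmr_select`, the `sigma` parameter) are assumptions for illustration, not the authors' exact definitions.

```python
import numpy as np

def fuzzy_relation(x, sigma=0.2):
    # Assumed membership function: Gaussian fuzzy similarity relation
    # over a single (ideally normalized) feature vector x of shape (n,).
    d = np.abs(x[:, None] - x[None, :])
    return np.exp(-(d ** 2) / (2 * sigma ** 2))

def fuzzy_entropy(R):
    # Relation-based fuzzy information entropy:
    # H = -(1/n) * sum_i log2(|[x_i]_R| / n), |[x_i]_R| = sum_j R[i, j].
    n = R.shape[0]
    card = R.sum(axis=1)
    return -np.mean(np.log2(card / n))

def fuzzy_mutual_information(Ra, Rb):
    # MI(A;B) = H(A) + H(B) - H(A,B); the joint relation is taken as
    # the element-wise minimum of the two relation matrices.
    Rab = np.minimum(Ra, Rb)
    return fuzzy_entropy(Ra) + fuzzy_entropy(Rb) - fuzzy_entropy(Rab)

def mrmr_select(X, y, k):
    # Greedy min-Redundancy-Max-Relevance selection: at each step pick
    # the feature maximizing relevance(f, y) minus its mean fuzzy MI
    # with the already-selected features.
    n, m = X.shape
    Ry = (y[:, None] == y[None, :]).astype(float)  # crisp relation on the class label
    Rf = [fuzzy_relation(X[:, j]) for j in range(m)]
    relevance = [fuzzy_mutual_information(Rf[j], Ry) for j in range(m)]
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(m):
            if j in selected:
                continue
            redundancy = np.mean(
                [fuzzy_mutual_information(Rf[j], Rf[s]) for s in selected]
            )
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

# Tiny synthetic example: the class label is driven by feature 0.
rng = np.random.default_rng(0)
X = rng.random((40, 5))
y = (X[:, 0] > 0.5).astype(int)
print(mrmr_select(X, y, 3))
```

The element-wise minimum for the joint relation guarantees the joint entropy is at least each marginal entropy, so the fuzzy mutual information stays nonnegative; the "Max-Dependency" and "min-Redundancy-Max-Dependency" variants mentioned in the summary would swap the per-feature score for a dependency measure on the whole candidate subset.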