Mini-Batch Normalized Mutual Information: A Hybrid Feature Selection Method


Bibliographic Details
Main Authors: G. S. Thejas, Sajal Raj Joshi, S. S. Iyengar, N. R. Sunitha, Prajwal Badrinath
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Subjects: Feature selection; filter method; hybrid feature selection; normalized mutual information; mini batch K-means; random forest
Online Access: https://ieeexplore.ieee.org/document/8805315/
_version_ 1818412631705255936
author G. S. Thejas
Sajal Raj Joshi
S. S. Iyengar
N. R. Sunitha
Prajwal Badrinath
author_sort G. S. Thejas
collection DOAJ
description Feature selection is a significant preprocessing step for classification in supervised machine learning. It is mostly applied when the attribute set is very large, since a large set of attributes often tends to mislead the classifier. Extensive research has been performed to increase the efficacy of the predictor by finding the optimal set of features. The selected feature subset should enhance classification accuracy by removing redundant features. We propose a new feature selection mechanism that amalgamates the filter and wrapper techniques, taking the benefits of both into consideration. Our hybrid model is a two-phase process: we first rank the features and then choose the best subset of features based on that ranking. We validated our model on various datasets using multiple evaluation metrics, and we compared and analyzed our results against previous works. The proposed model outperformed many existing algorithms.
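The abstract's keywords suggest the two phases are an NMI-based filter (with mini-batch K-means) followed by a random-forest wrapper. A minimal sketch of that pipeline, using scikit-learn; the function names, cluster count, and greedy top-k search here are illustrative assumptions, not the authors' exact algorithm:

```python
# Hedged sketch of a two-phase hybrid (filter + wrapper) feature selector.
# Phase 1: discretize each feature with Mini-Batch K-Means and rank features
#          by normalized mutual information (NMI) with the class label.
# Phase 2: pick the best top-k prefix of the ranking with a random-forest
#          wrapper evaluated by cross-validation.
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import normalized_mutual_info_score
from sklearn.model_selection import cross_val_score

def nmi_rank(X, y, n_clusters=8, seed=0):
    """Filter phase: score feature j by NMI(cluster(X[:, j]), y)."""
    scores = []
    for j in range(X.shape[1]):
        km = MiniBatchKMeans(n_clusters=n_clusters, random_state=seed, n_init=3)
        labels = km.fit_predict(X[:, [j]])  # cluster one feature at a time
        scores.append(normalized_mutual_info_score(y, labels))
    return np.argsort(scores)[::-1]  # indices of features, best first

def select_features(X, y, seed=0):
    """Wrapper phase: evaluate top-k prefixes of the ranking with a RF."""
    order = nmi_rank(X, y, seed=seed)
    best_k, best_acc = 1, -np.inf
    for k in range(1, len(order) + 1):
        rf = RandomForestClassifier(n_estimators=50, random_state=seed)
        acc = cross_val_score(rf, X[:, order[:k]], y, cv=3).mean()
        if acc > best_acc:
            best_k, best_acc = k, acc
    return order[:best_k], best_acc
```

The greedy prefix search keeps the wrapper cost linear in the number of features, which is the usual motivation for ranking first instead of searching all subsets.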
first_indexed 2024-12-14T10:50:23Z
format Article
id doaj.art-766b667e61104e6c8500828434e9c06e
institution Directory Open Access Journal
issn 2169-3536
language English
last_indexed 2024-12-14T10:50:23Z
publishDate 2019-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling doaj.art-766b667e61104e6c8500828434e9c06e
DOI: 10.1109/ACCESS.2019.2936346
Article number: 8805315
Published in: IEEE Access, vol. 7, pp. 116875-116885, 2019-01-01 (IEEE, ISSN 2169-3536)
Authors and affiliations:
G. S. Thejas (https://orcid.org/0000-0001-9606-0128), School of Computing and Information Sciences, Florida International University, Miami, FL, USA
Sajal Raj Joshi, Department of Computer Science Engineering, Siddaganga Institute of Technology, Tumakuru, India
S. S. Iyengar, School of Computing and Information Sciences, Florida International University, Miami, FL, USA
N. R. Sunitha, Department of Computer Science Engineering, Siddaganga Institute of Technology, Tumakuru, India
Prajwal Badrinath, School of Computing and Information Sciences, Florida International University, Miami, FL, USA
Online Access: https://ieeexplore.ieee.org/document/8805315/
Keywords: Feature selection; filter method; hybrid feature selection; normalized mutual information; mini batch K-means; random forest
title Mini-Batch Normalized Mutual Information: A Hybrid Feature Selection Method
topic Feature selection
filter method
hybrid feature selection
normalized mutual information
mini batch K-means
random forest
url https://ieeexplore.ieee.org/document/8805315/