Improved Harris Hawks Optimization Using Elite Opposition-Based Learning and Novel Search Mechanism for Feature Selection


Bibliographic Details
Main Authors: Rami Sihwail, Khairuddin Omar, Khairul Akram Zainol Ariffin, Mohammad Tubishat
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Subjects: Harris Hawks optimization; optimization; feature selection; elite opposition-based learning; mutation; mutation neighborhood search
Online Access:https://ieeexplore.ieee.org/document/9131696/
author Rami Sihwail
Khairuddin Omar
Khairul Akram Zainol Ariffin
Mohammad Tubishat
collection DOAJ
description The rapid increase in data volume and feature dimensionality has a negative influence on machine learning and many other fields, decreasing classification accuracy and increasing computational cost. Feature selection plays a critical role as a preprocessing step in mitigating these issues. It works by eliminating features that may negatively influence classifier performance, such as irrelevant, redundant, and less informative features. This paper introduces an improved Harris hawks optimization (IHHO) by utilizing elite opposition-based learning and proposing a new search mechanism. Harris hawks optimization (HHO) is a metaheuristic general-purpose algorithm recently introduced to solve continuous search problems. Compared to conventional HHO, the proposed IHHO can avoid becoming trapped in local optima and has an enhanced search mechanism, relying on mutation, mutation neighborhood search, and rollback strategies to strengthen its search capabilities. Moreover, it improves population diversity and computational accuracy and accelerates the convergence rate. To evaluate the performance of IHHO, we conducted a series of experiments on twenty benchmark datasets collected from the UCI repository and the scikit-feature project. The datasets represent different levels of feature dimensionality: low, moderate, and high. Four criteria were adopted to determine the superiority of IHHO: classification accuracy, fitness value, number of selected features, and statistical tests. Furthermore, IHHO was compared against other well-known algorithms, including the Genetic Algorithm (GA), Grasshopper Optimization Algorithm (GOA), Particle Swarm Optimization (PSO), Ant Lion Optimizer (ALO), Whale Optimization Algorithm (WOA), Butterfly Optimization Algorithm (BOA), and Slime Mould Algorithm (SMA).
The experimental results confirm the superiority of IHHO over the other optimization algorithms in accuracy, fitness value, and number of selected features.
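The description above references elite opposition-based learning (EOBL). As a rough illustration of the general EOBL idea only (not the paper's exact formulation), the sketch below generates opposite candidates with respect to the dynamic bounds spanned by an elite subset of the population; the function names and the toy fitness function are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(42)

def elite_opposition(population, elites, rng):
    """Generate elite opposition-based candidate solutions.

    For each dimension j, the opposite of x_j is taken with respect to
    the dynamic bounds [a_j, b_j] spanned by the elite solutions:
        x'_j = k * (a_j + b_j) - x_j,  with k ~ U(0, 1).
    """
    a = elites.min(axis=0)   # per-dimension lower bound of the elite set
    b = elites.max(axis=0)   # per-dimension upper bound of the elite set
    k = rng.random(population.shape)
    opposite = k * (a + b) - population
    # Clip opposite points that escape the elite bounds back inside them
    return np.clip(opposite, a, b)

# Toy example: 6 candidate solutions over 4 features
pop = rng.random((6, 4))
fitness = pop.sum(axis=1)                # stand-in fitness; real use: classifier error
elites = pop[np.argsort(fitness)[:3]]    # keep the 3 best candidates as the elite set
opp = elite_opposition(pop, elites, rng)
print(opp.shape)  # (6, 4)
```

In an EOBL-style algorithm, these opposite candidates would be pooled with the current population and the fittest members retained, which is one way such schemes improve population diversity and convergence speed.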
first_indexed 2024-12-16T17:23:47Z
format Article
id doaj.art-bbf553bd3d08490e8b1e936d2b6ae5d9
institution Directory Open Access Journal
issn 2169-3536
language English
last_indexed 2024-12-16T17:23:47Z
publishDate 2020-01-01
publisher IEEE
record_format Article
series IEEE Access
doi 10.1109/ACCESS.2020.3006473
article_number 9131696
volume 8
pages 121127-121145
author_affiliation Rami Sihwail (https://orcid.org/0000-0001-8326-3655): Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia, Bangi, Malaysia
author_affiliation Khairuddin Omar: Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia, Bangi, Malaysia
author_affiliation Khairul Akram Zainol Ariffin (https://orcid.org/0000-0003-3627-556X): Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia, Bangi, Malaysia
author_affiliation Mohammad Tubishat: School of Technology and Computing, Asia Pacific University of Technology and Innovation, Kuala Lumpur, Malaysia
title Improved Harris Hawks Optimization Using Elite Opposition-Based Learning and Novel Search Mechanism for Feature Selection
topic Harris Hawks optimization
optimization
feature selection
elite opposition-based learning
mutation
mutation neighborhood search
url https://ieeexplore.ieee.org/document/9131696/