A Comparison Study About Parameter Optimization Using Swarm Algorithms

Adjusting the parameters of a machine learning algorithm can be difficult when the search space of those parameters is large. Moreover, if a sensitive parameter is set incorrectly, the final results can change substantially, so manual tuning is not trivial. To automate this tuning, this work proposes six models that use optimization algorithms to adjust the parameters of two machine learning algorithms: an extreme learning machine (ELM) neural network and support vector regression (SVR). The optimization algorithms used are Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), and the Genetic Algorithm (GA). The models were compared on predictive accuracy using Mean Absolute Error (MAE), Mean Squared Error (MSE), and Root Mean Squared Error (RMSE), together with statistical tests. Experimental results on ten datasets from different contexts indicate that the optimized models outperform their non-optimized counterparts in convergence, accuracy, and robustness. The automatic adjustment of parameters is therefore a powerful tool for analyzing different data contexts, and the optimized models (in particular ELM-PSO) were the most accurate across all experimental evaluations.
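The paper couples PSO (as well as ABC and GA) with ELM and SVR to search each model's parameter space automatically. The authors' implementation is not reproduced in this record; as an illustration only, here is a minimal PSO sketch in Python, with a stand-in objective in place of a real model's validation error (the `fake_validation_rmse` function and the (C, gamma)-style bounds are hypothetical):

```python
import random

def pso(objective, bounds, n_particles=20, n_iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal Particle Swarm Optimization over a box-constrained search space."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize particle positions uniformly within the bounds; velocities start at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # each particle's best position so far
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull (own best) + social pull (swarm best).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move the particle and clamp it to the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical stand-in objective: pretend the model's validation error
# is minimized at C = 10, gamma = 0.1. In the paper's setting this would
# instead train an ELM or SVR and return its validation RMSE.
def fake_validation_rmse(params):
    c, gamma = params
    return (c - 10.0) ** 2 + (gamma - 0.1) ** 2

best, best_val = pso(fake_validation_rmse, bounds=[(0.1, 100.0), (0.001, 1.0)])
```

In the study's setup, the objective would wrap a full train-and-validate cycle of the base learner, so each fitness evaluation is one model fit.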

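The models are ranked by MAE, MSE, and RMSE. These three criteria follow directly from the prediction residuals; a small self-contained sketch:

```python
import math

def mae(y_true, y_pred):
    # Mean Absolute Error: average magnitude of the residuals.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    # Mean Squared Error: penalizes large residuals quadratically.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    # Root Mean Squared Error: MSE expressed in the units of the target.
    return math.sqrt(mse(y_true, y_pred))

# Toy residual check (illustrative values, not from the paper):
y_true = [3.0, 5.0, 2.5, 7.0]
y_pred = [2.5, 5.0, 4.0, 8.0]
# mae -> (0.5 + 0.0 + 1.5 + 1.0) / 4 = 0.75
# mse -> (0.25 + 0.0 + 2.25 + 1.0) / 4 = 0.875
```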

Bibliographic Details
Main Authors: Halcyon Davys Pereira De Carvalho, Wedson L. Soares, Wylliams Barbosa Santos, Roberta Fagundes
Format: Article
Language: English
Published: IEEE, 2022-01-01
Series: IEEE Access
Subjects: Machine learning; extreme learning machine; support vector regression; ensemble; optimization algorithm
Online Access: https://ieeexplore.ieee.org/document/9775108/
Collection: DOAJ (Directory of Open Access Journals)
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3175202
Published in: IEEE Access, vol. 10, pp. 55488-55498, 2022 (article 9775108)
Author ORCIDs: Halcyon Davys Pereira De Carvalho (0000-0001-8933-5912); Wedson L. Soares (0000-0002-0078-3944); Wylliams Barbosa Santos (0000-0003-2578-1248); Roberta Fagundes (0000-0002-7172-4183)
Affiliation (all authors): Department of Computer Engineering, University of Pernambuco, Recife, Brazil