Grammatical Evolution-Driven Algorithm for Efficient and Automatic Hyperparameter Optimisation of Neural Networks

Neural networks have revolutionised the way we approach problem solving across multiple domains; however, their effective design and efficient use of computational resources remain challenging tasks. One of the most important factors influencing this process is the model hyperparameters, which vary significantly across models and datasets. Recently, there has been an increased focus on automatically tuning these hyperparameters to reduce complexity and to optimise resource utilisation. From traditional human-intuitive tuning methods to random search, grid search, Bayesian optimisation, and evolutionary algorithms, significant advancements have been made in this direction, promising improved performance while using fewer resources. In this article, we propose HyperGE, a two-stage model for automatically tuning hyperparameters, driven by grammatical evolution (GE), a bioinspired population-based machine learning algorithm. GE has the advantage of allowing users to define their own grammar for generating solutions, making it ideal for specifying search spaces across datasets and models. We test HyperGE by fine-tuning the VGG-19 and ResNet-50 pre-trained networks on three benchmark datasets. We demonstrate that the search space is reduced by ~90% in Stage 2, requiring fewer trials. HyperGE could become an invaluable tool within the deep learning community, allowing practitioners greater freedom when exploring complex problem domains for hyperparameter fine-tuning.

Bibliographic Details
Main Authors: Gauri Vaidya, Meghana Kshirsagar, Conor Ryan
Format: Article
Language: English
Published: MDPI AG, 2023-06-01
Series: Algorithms
Subjects: search space pruning, machine learning, grammatical evolution, combinatorial optimisation, computer vision, metaheuristics
Online Access: https://www.mdpi.com/1999-4893/16/7/319
_version_ 1797590587109539840
author Gauri Vaidya
Meghana Kshirsagar
Conor Ryan
author_facet Gauri Vaidya
Meghana Kshirsagar
Conor Ryan
author_sort Gauri Vaidya
collection DOAJ
description Neural networks have revolutionised the way we approach problem solving across multiple domains; however, their effective design and efficient use of computational resources remain challenging tasks. One of the most important factors influencing this process is the model hyperparameters, which vary significantly across models and datasets. Recently, there has been an increased focus on automatically tuning these hyperparameters to reduce complexity and to optimise resource utilisation. From traditional human-intuitive tuning methods to random search, grid search, Bayesian optimisation, and evolutionary algorithms, significant advancements have been made in this direction, promising improved performance while using fewer resources. In this article, we propose HyperGE, a two-stage model for automatically tuning hyperparameters, driven by grammatical evolution (GE), a bioinspired population-based machine learning algorithm. GE has the advantage of allowing users to define their own grammar for generating solutions, making it ideal for specifying search spaces across datasets and models. We test HyperGE by fine-tuning the VGG-19 and ResNet-50 pre-trained networks on three benchmark datasets. We demonstrate that the search space is reduced by ~90% in Stage 2, requiring fewer trials. HyperGE could become an invaluable tool within the deep learning community, allowing practitioners greater freedom when exploring complex problem domains for hyperparameter fine-tuning.
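
The description above hinges on GE's use of a user-defined grammar to specify the hyperparameter search space. The sketch below is purely illustrative and not taken from the article: the grammar, the hyperparameter values, and the names GRAMMAR and map_genotype are assumptions chosen for demonstration. It shows how a standard GE genotype-to-phenotype mapping can turn a list of integer codons into one hyperparameter configuration drawn from a grammar-defined search space.

    # Minimal illustrative sketch of a GE-style genotype-to-phenotype mapping for
    # hyperparameter sampling. The grammar and values below are hypothetical and
    # are NOT the HyperGE grammar from the article.
    import random

    # BNF-like grammar: each non-terminal maps to a list of candidate expansions.
    GRAMMAR = {
        "<config>": [["<lr>", "<batch>", "<optimiser>"]],
        "<lr>": [["1e-2"], ["1e-3"], ["1e-4"], ["5e-5"]],
        "<batch>": [["16"], ["32"], ["64"], ["128"]],
        "<optimiser>": [["adam"], ["sgd"], ["rmsprop"]],
    }

    def map_genotype(genome, start="<config>"):
        """Left-most derivation: each codon (integer) picks a production rule,
        modulo the number of choices available for the current non-terminal."""
        symbols, phenotype, idx = [start], [], 0
        while symbols:
            symbol = symbols.pop(0)
            if symbol in GRAMMAR:
                choices = GRAMMAR[symbol]
                codon = genome[idx % len(genome)]  # wrap around if the genome is short
                idx += 1
                symbols = list(choices[codon % len(choices)]) + symbols
            else:
                phenotype.append(symbol)  # terminal symbol: part of the final config
        return phenotype

    if __name__ == "__main__":
        genome = [random.randint(0, 255) for _ in range(8)]  # random integer codons
        lr, batch, opt = map_genotype(genome)
        print(f"sampled hyperparameters: lr={lr}, batch_size={batch}, optimiser={opt}")

The modulo rule is what makes any integer genome decodable against any grammar, so the search space can be rewritten (for instance, pruned between stages in a two-stage scheme like the one described) without changing the search machinery itself.
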
first_indexed 2024-03-11T01:22:40Z
format Article
id doaj.art-8a8307e454264a569b40514aa940e2e1
institution Directory Open Access Journal
issn 1999-4893
language English
last_indexed 2024-03-11T01:22:40Z
publishDate 2023-06-01
publisher MDPI AG
record_format Article
series Algorithms
spelling Record doaj.art-8a8307e454264a569b40514aa940e2e1, updated 2023-11-18T17:58:58Z. Algorithms, vol. 16, no. 7, art. 319 (2023-06-01), MDPI AG, ISSN 1999-4893, DOI 10.3390/a16070319. Grammatical Evolution-Driven Algorithm for Efficient and Automatic Hyperparameter Optimisation of Neural Networks. Gauri Vaidya, Meghana Kshirsagar, Conor Ryan (all: Biocomputing and Developmental Systems Research Group, University of Limerick, V94 T9PX Limerick, Ireland). Online access: https://www.mdpi.com/1999-4893/16/7/319. Keywords: search space pruning; machine learning; grammatical evolution; combinatorial optimisation; computer vision; metaheuristics.
spellingShingle Gauri Vaidya
Meghana Kshirsagar
Conor Ryan
Grammatical Evolution-Driven Algorithm for Efficient and Automatic Hyperparameter Optimisation of Neural Networks
Algorithms
search space pruning
machine learning
grammatical evolution
combinatorial optimisation
computer vision
metaheuristics
title Grammatical Evolution-Driven Algorithm for Efficient and Automatic Hyperparameter Optimisation of Neural Networks
title_full Grammatical Evolution-Driven Algorithm for Efficient and Automatic Hyperparameter Optimisation of Neural Networks
title_fullStr Grammatical Evolution-Driven Algorithm for Efficient and Automatic Hyperparameter Optimisation of Neural Networks
title_full_unstemmed Grammatical Evolution-Driven Algorithm for Efficient and Automatic Hyperparameter Optimisation of Neural Networks
title_short Grammatical Evolution-Driven Algorithm for Efficient and Automatic Hyperparameter Optimisation of Neural Networks
title_sort grammatical evolution driven algorithm for efficient and automatic hyperparameter optimisation of neural networks
topic search space pruning
machine learning
grammatical evolution
combinatorial optimisation
computer vision
metaheuristics
url https://www.mdpi.com/1999-4893/16/7/319
work_keys_str_mv AT gaurivaidya grammaticalevolutiondrivenalgorithmforefficientandautomatichyperparameteroptimisationofneuralnetworks
AT meghanakshirsagar grammaticalevolutiondrivenalgorithmforefficientandautomatichyperparameteroptimisationofneuralnetworks
AT conorryan grammaticalevolutiondrivenalgorithmforefficientandautomatichyperparameteroptimisationofneuralnetworks