To what extent is tuned neural network pruning beneficial in software effort estimation?

Software effort estimation (SEE) is important for planning the budgets of future projects. SEE models have evolved alongside advances in hardware technology; however, building such models with neural networks considerably increases the computational burden. Neural network pruning offers a way to alleviate that burden: by identifying the elements that make insignificant contributions to the output of a trained network, a smaller yet reliable model can be obtained. Done carelessly, however, pruning may discard valuable information learned by the trained network. This work investigates the effects of pruning a multi-layer perceptron (MLP) on SEE. Those effects are evaluated experimentally on eight SEE data sets, using four optimization methods together with two pruning techniques to find the optimal MLP configuration. The results show that each optimization method has a distinctive threshold at which pruning should be suspended. For a model to reach a low SEE error, the number of features with low standard deviations should exceed the number of features with high standard deviations. If the hyperparameters of the pruning algorithm are tuned, the genetic algorithm is recommended to obtain high classification accuracy. This work provides a guideline for researchers to understand the effectiveness of neural network pruning in SEE.

Bibliographic Details
Main Author: Muhammed Maruf Ozturk (Department of Computer Engineering, Faculty of Engineering, Isparta, Turkey)
Format: Article
Language: English
Published: Vladimir Andrunachievici Institute of Mathematics and Computer Science, 2021-12-01
Series: Computer Science Journal of Moldova, vol. 29, no. 3(87), pp. 340-365
ISSN: 1561-4042
Subjects: effort estimation; hyperparameter optimization; neural network pruning
Online Access: http://www.math.md/files/csjm/v29-n3/v29-n3-(pp340-365).pdf
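
The record contains no code, and the paper's exact algorithms are not given here. As a rough illustration of the technique the abstract describes, the sketch below applies magnitude-based pruning to a toy one-hidden-layer MLP and compares prediction error before and after. The synthetic data, layer sizes, and the 50% sparsity level are assumptions made for illustration, not the paper's setup or data sets.

```python
import numpy as np

# Minimal illustrative sketch (not the paper's implementation): magnitude-based
# pruning of a single-hidden-layer MLP for an effort-estimation-style regression.
# The data below are synthetic placeholders, not one of the eight SEE data sets.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))                                   # 100 projects, 8 features
y = X @ rng.normal(size=8) + rng.normal(scale=0.1, size=100)    # synthetic "effort" target

# Randomly initialised one-hidden-layer MLP (training is omitted for brevity).
W1 = rng.normal(scale=0.5, size=(8, 16))
W2 = rng.normal(scale=0.5, size=(16, 1))

def predict(X, W1, W2):
    h = np.maximum(X @ W1, 0.0)          # ReLU hidden layer
    return (h @ W2).ravel()

def prune_by_magnitude(W, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest absolute values."""
    k = int(sparsity * W.size)
    if k == 0:
        return W
    threshold = np.sort(np.abs(W), axis=None)[k - 1]
    return np.where(np.abs(W) <= threshold, 0.0, W)

# Prune 50% of the weights in each layer and compare mean absolute error.
W1_p = prune_by_magnitude(W1, 0.5)
W2_p = prune_by_magnitude(W2, 0.5)
mae_full = np.mean(np.abs(predict(X, W1, W2) - y))
mae_pruned = np.mean(np.abs(predict(X, W1_p, W2_p) - y))
print(f"MAE full: {mae_full:.3f}  MAE pruned: {mae_pruned:.3f}")
```

In the setting the abstract describes, the sparsity level and the other pruning hyperparameters would not be fixed by hand as above but chosen by an optimizer, with the genetic algorithm recommended by the study.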