Downward-Growing Neural Networks


Bibliographic Details
Main Authors: Vincenzo Laveglia, Edmondo Trentin
Format: Article
Language: English
Published: MDPI AG 2023-04-01
Series: Entropy
Subjects: deep neural network; deep learning; adaptive architecture; growing neural network; target propagation
Online Access: https://www.mdpi.com/1099-4300/25/5/733
author Vincenzo Laveglia
Edmondo Trentin
author_facet Vincenzo Laveglia
Edmondo Trentin
author_sort Vincenzo Laveglia
collection DOAJ
description A major issue in the application of deep learning is the definition of a proper architecture for the learning machine at hand, such that the model is neither excessively large (which results in overfitting the training data) nor too small (which limits the learning and modeling capabilities of the automatic learner). This issue has driven the development of algorithms that automatically grow and prune the architecture as part of the learning process. The paper introduces a novel approach to growing the architecture of deep neural networks, called the downward-growing neural network (DGNN). The approach can be applied to arbitrary feed-forward deep neural networks. Groups of neurons that negatively affect the performance of the network are selected and grown with the aim of improving the learning and generalization capabilities of the resulting machine. The growing process is realized by replacing these groups of neurons with sub-networks that are trained via ad hoc target propagation techniques. In so doing, the growth takes place simultaneously in both the depth and the width of the DGNN architecture. We assess the effectiveness of the DGNN empirically on several UCI datasets, where it significantly improves average accuracy over a range of established deep neural networks and over two popular growing algorithms, namely AdaNet and the cascade-correlation neural network.
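
The abstract sketches the growing step only at a high level. The following minimal PyTorch sketch illustrates one such step, written from the abstract alone rather than from the paper: the gradient-magnitude rule for selecting a poorly performing unit, the single-step gradient-based targets standing in for the paper's target propagation, and all layer sizes are illustrative assumptions, not the authors' actual choices.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Toy binary-classification data (a stand-in for a UCI dataset).
    X = torch.randn(256, 8)
    y = (X[:, 0] + X[:, 1] > 0).float().unsqueeze(1)

    # Initial feed-forward net: an 8->4 hidden layer followed by a head.
    hidden = nn.Linear(8, 4)
    head = nn.Sequential(nn.Tanh(), nn.Linear(4, 1))
    loss_fn = nn.BCEWithLogitsLoss()

    # Step 1 (assumed selection rule): score hidden units by the mean
    # magnitude of the loss gradient at their activations and treat the
    # highest-scoring unit as "negatively affecting performance".
    h = hidden(X)
    h.retain_grad()                      # keep grads on a non-leaf tensor
    loss_fn(head(h), y).backward()
    worst = torch.argmax(h.grad.abs().mean(dim=0)).item()

    # Step 2 (crude proxy for target propagation): the target for the
    # selected unit is its activation nudged down the loss gradient.
    with torch.no_grad():
        target = h[:, worst] - 1.0 * h.grad[:, worst]

    # Step 3 (growing): replace the unit with a deeper/wider sub-network
    # trained to reproduce the propagated targets, so the architecture
    # grows in both depth and width at that spot.
    sub = nn.Sequential(nn.Linear(8, 6), nn.Tanh(), nn.Linear(6, 1))
    opt = torch.optim.Adam(sub.parameters(), lr=1e-2)
    for _ in range(200):
        opt.zero_grad()
        mse = ((sub(X).squeeze(1) - target) ** 2).mean()
        mse.backward()
        opt.step()

    # The grown network keeps the surviving units of `hidden` and routes
    # the replaced unit through `sub`; a full DGNN would fine-tune the
    # whole model end-to-end afterwards.
    def grown_forward(x):
        g = hidden(x)
        g = torch.cat([g[:, :worst], sub(x), g[:, worst + 1:]], dim=1)
        return head(g)

    print(loss_fn(grown_forward(X), y).item())

Everything beyond the abstract's outline (the scoring rule, the one-step gradient targets, the fixed sub-network shape, growing a single unit rather than a group) is a placeholder for the procedure detailed in the paper.
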
first_indexed 2024-03-11T03:46:10Z
format Article
id doaj.art-e2f23612ba3d47eaa8c4acb7149e7ad3
institution Directory Open Access Journal
issn 1099-4300
language English
last_indexed 2024-03-11T03:46:10Z
publishDate 2023-04-01
publisher MDPI AG
record_format Article
series Entropy
spelling doaj.art-e2f23612ba3d47eaa8c4acb7149e7ad3 (indexed 2023-11-18T01:15:37Z); eng; MDPI AG; Entropy, ISSN 1099-4300, vol. 25, no. 5, art. 733, published 2023-04-01; DOI 10.3390/e25050733; "Downward-Growing Neural Networks"; Vincenzo Laveglia (DINFO, Università di Firenze, Via di S. Marta 3, 50139 Firenze, Italy); Edmondo Trentin (DIISM, Università di Siena, Via Roma 56, 53100 Siena, Italy); abstract as in the description field above; https://www.mdpi.com/1099-4300/25/5/733; keywords: deep neural network, deep learning, adaptive architecture, growing neural network, target propagation
spellingShingle Vincenzo Laveglia
Edmondo Trentin
Downward-Growing Neural Networks
Entropy
deep neural network
deep learning
adaptive architecture
growing neural network
target propagation
title Downward-Growing Neural Networks
title_full Downward-Growing Neural Networks
title_fullStr Downward-Growing Neural Networks
title_full_unstemmed Downward-Growing Neural Networks
title_short Downward-Growing Neural Networks
title_sort downward growing neural networks
topic deep neural network
deep learning
adaptive architecture
growing neural network
target propagation
url https://www.mdpi.com/1099-4300/25/5/733
work_keys_str_mv AT vincenzolaveglia downwardgrowingneuralnetworks
AT edmondotrentin downwardgrowingneuralnetworks