Learning in Feedforward Neural Networks Accelerated by Transfer Entropy

Bibliographic Details
Main Authors: Adrian Moldovan (Department of Electronics and Computers, Transilvania University, 500024 Braşov, Romania); Angel Caţaron (Department of Electronics and Computers, Transilvania University, 500024 Braşov, Romania); Răzvan Andonie (Department of Computer Science, Central Washington University, Ellensburg, WA 98926, USA)
Format: Article
Language: English
Published: MDPI AG, 2020-01-01
Series: Entropy, Vol. 22, No. 1, Article 102
DOI: 10.3390/e22010102
ISSN: 1099-4300
Collection: Directory of Open Access Journals (DOAJ)
Subjects: transfer entropy; causality; neural network; backpropagation; gradient descent; deep learning
Online Access: https://www.mdpi.com/1099-4300/22/1/102
Description: Current neural network architectures are increasingly hard to train because of the growing size and complexity of the datasets they use. Our objective is to design more efficient training algorithms that exploit causal relationships inferred from neural networks. Transfer entropy (TE) was initially introduced as an information transfer measure used to quantify the statistical coherence between events (time series). It was later related to causality, although the two are not the same. Only a few papers report applications of causality or TE in neural networks. Our contribution is an information-theoretical method for analyzing information transfer between the nodes of feedforward neural networks. The information transfer is measured by the TE of feedback neural connections. Intuitively, TE measures the relevance of a connection in the network, and the feedback amplifies this connection. We introduce a backpropagation-type training algorithm that uses TE feedback connections to improve its performance.
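
For context, transfer entropy was introduced by Schreiber (2000). For a target series X and a source series Y, it measures how much the past of Y reduces uncertainty about the next value of X beyond what the past of X already provides:

$$ T_{Y \to X} = \sum p\left(x_{t+1}, x_t^{(k)}, y_t^{(l)}\right) \log \frac{p\left(x_{t+1} \mid x_t^{(k)}, y_t^{(l)}\right)}{p\left(x_{t+1} \mid x_t^{(k)}\right)} $$

Here $x_t^{(k)}$ and $y_t^{(l)}$ denote the k and l most recent values of each series, and the sum runs over all joint states. Unlike mutual information, TE is asymmetric ($T_{Y \to X} \neq T_{X \to Y}$ in general), which is what makes it usable as a directed, causality-flavored measure of information transfer between network nodes. The history lengths are parameters of the measure; this record does not state which values the paper uses.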
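
The record does not describe the training algorithm beyond the abstract, so the following is only a minimal illustrative sketch of the idea as stated there: estimate TE between the activation series of connected neurons and use it to amplify the gradient updates of the more relevant connections. Everything in this sketch is an assumption for illustration (binary activations, history lengths k = l = 1, the (1 + TE) scaling, the synthetic data and shapes); it is not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def binary_transfer_entropy(y_src, x_dst):
    # Plug-in TE estimate T_{Y -> X} for two binary (0/1) series,
    # with history lengths k = l = 1 (an assumption of this sketch).
    counts = np.zeros((2, 2, 2))
    for t in range(len(x_dst) - 1):
        counts[x_dst[t + 1], x_dst[t], y_src[t]] += 1.0
    p = counts / counts.sum()
    te = 0.0
    for x_next in (0, 1):
        for x in (0, 1):
            for y in (0, 1):
                p_joint = p[x_next, x, y]
                if p_joint == 0.0:
                    continue
                p_full = p_joint / p[:, x, y].sum()                # p(x_{t+1} | x_t, y_t)
                p_self = p[x_next, x, :].sum() / p[:, x, :].sum()  # p(x_{t+1} | x_t)
                te += p_joint * np.log2(p_full / p_self)
    return te

# Toy layer: 3 inputs, 2 outputs. Activations are binarized series
# recorded over 200 training steps (purely synthetic here).
pre = (rng.random((200, 3)) > 0.5).astype(int)
post = (rng.random((200, 2)) > 0.5).astype(int)
W = rng.normal(size=(3, 2))
grad = rng.normal(size=W.shape)  # stand-in for dL/dW from backpropagation

# TE from each input neuron's series to each output neuron's series.
te = np.array([[binary_transfer_entropy(pre[:, i], post[:, j])
                for j in range(W.shape[1])]
               for i in range(W.shape[0])])

# Amplify gradients on connections with higher TE; the "(1 + TE)"
# scaling is a hypothetical rule, not the paper's formula.
learning_rate = 0.1
W -= learning_rate * (1.0 + te) * grad

With random series like these the estimated TE is near zero, so the update reduces to plain gradient descent; on activation series recorded during real training, connections that transfer more information would receive proportionally larger updates, which matches the intuition described in the abstract.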