Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group
We investigate the analogy between the renormalization group (RG) and deep neural networks, wherein subsequent layers of neurons are analogous to successive steps along the RG. In particular, we quantify the flow of information by explicitly computing the relative entropy or Kullback-Leibler divergence in both the one- and two-dimensional Ising models under decimation RG, as well as in a feedforward neural network as a function of depth. We observe qualitatively identical behavior characterized by a monotonic increase to a parameter-dependent asymptotic value. On the quantum field theory side, the monotonic increase confirms the connection between the relative entropy and the c-theorem. For the neural networks, the asymptotic behavior may have implications for various information-maximization methods in machine learning, as well as for disentangling compactness and generalizability. Furthermore, while both the two-dimensional Ising model and the random neural networks we consider exhibit non-trivial critical points, the relative entropy appears insensitive to the phase structure of either system. In this sense, more refined probes are required to fully elucidate the flow of information in these models.
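For readers who want to experiment with the kind of flow the abstract describes, the following is a minimal toy sketch, not the authors' computation: the exact decimation recursion K → (1/2) ln cosh(2K) for the one-dimensional Ising chain, together with the Kullback-Leibler divergence between the initial Boltzmann distribution and the distribution at the renormalized coupling on the same small periodic lattice, evaluated by exact enumeration. The chain length and initial coupling are illustrative choices, and comparing distributions at different couplings on a fixed lattice is a simplification of the paper's setup.

```python
import itertools
import numpy as np

def boltzmann(K, N):
    """Boltzmann distribution of a periodic 1D Ising chain with
    N spins and dimensionless nearest-neighbor coupling K."""
    states = list(itertools.product([-1, 1], repeat=N))
    energies = np.array([sum(s[i] * s[(i + 1) % N] for i in range(N))
                         for s in states])
    weights = np.exp(K * energies)
    return weights / weights.sum()

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two
    distributions on the same fully enumerated state space."""
    return float(np.sum(p * np.log(p / q)))

N, K0 = 10, 1.0           # illustrative chain length and initial coupling
p0 = boltzmann(K0, N)     # reference distribution at the initial coupling
K = K0
for step in range(1, 9):
    # Exact 1D decimation: integrating out every other spin
    # renormalizes the coupling as K -> (1/2) ln cosh(2K).
    K = 0.5 * np.log(np.cosh(2.0 * K))
    print(f"step {step}: K = {K:.4f}, "
          f"D(p0 || pK) = {kl_divergence(p0, boltzmann(K, N)):.4f}")
```

Because the coupling flows monotonically to the trivial high-temperature fixed point K = 0, the divergence in this toy grows monotonically and saturates at D(p0 || uniform), mirroring the parameter-dependent asymptote described in the abstract.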
Main Authors: | Johanna Erdmenger, Kevin T. Grosvenor, Ro Jefferson |
---|---|
Format: | Article |
Language: | English |
Published: | SciPost, 2022-01-01 |
Series: | SciPost Physics |
Online Access: | https://scipost.org/SciPostPhys.12.1.041 |
DOI: | 10.21468/SciPostPhys.12.1.041 |
ISSN: | 2542-4653 |
Collection: | Directory of Open Access Journals (DOAJ) |
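On the neural-network side, a similarly hedged sketch illustrates the standard mean-field signal-propagation picture for wide random tanh networks, the setting in which such networks exhibit an order-to-chaos critical point. It iterates the pre-activation variance recursion q_{l+1} = σ_w² E[tanh(√q_l z)²] + σ_b² and tracks the KL divergence between the zero-mean Gaussian pre-activation distributions at the input and at depth l. This illustrates the depth-dependence described in the abstract rather than reproducing the authors' exact computation; the values of σ_w², σ_b², and the initial variance are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def variance_map(q, sw2, sb2, n_samples=200_000):
    """One step of the mean-field recursion for the pre-activation
    variance of a wide random tanh network:
    q_{l+1} = sw2 * E[tanh(sqrt(q_l) z)^2] + sb2,  z ~ N(0, 1),
    with the expectation estimated by Monte Carlo sampling."""
    z = rng.standard_normal(n_samples)
    return sw2 * np.mean(np.tanh(np.sqrt(q) * z) ** 2) + sb2

def kl_gaussians(q1, q2):
    """D(N(0, q1) || N(0, q2)) between zero-mean Gaussians."""
    return 0.5 * (np.log(q2 / q1) + q1 / q2 - 1.0)

sw2, sb2, q0 = 1.5, 0.05, 1.0   # illustrative weight/bias variances, input variance
q = q0
for layer in range(1, 11):
    q = variance_map(q, sw2, sb2)
    print(f"layer {layer}: q = {q:.4f}, "
          f"D(N(0,q0) || N(0,q)) = {kl_gaussians(q0, q):.5f}")
```

The variance converges rapidly to the fixed point of the map, so the divergence saturates at a value set by (σ_w², σ_b²), in qualitative agreement with the asymptotic behavior reported in the abstract.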