Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group
We investigate the analogy between the renormalization group (RG) and deep neural networks, wherein subsequent layers of neurons are analogous to successive steps along the RG. In particular, we quantify the flow of information by explicitly computing the relative entropy or Kullback-Leibler divergence...
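As a rough illustration of the quantity the abstract refers to, the sketch below computes the Kullback-Leibler divergence between two multivariate Gaussians, the closed form relevant in the wide-network (Gaussian) limit the RG analogy relies on. This is not the authors' code; the function name, the toy covariances, and the interpretation of the two distributions as successive-layer pre-activation statistics are illustrative assumptions.

```python
# Minimal sketch, assuming Gaussian layer statistics (illustrative, not from the paper).
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """D_KL( N(mu0, cov0) || N(mu1, cov1) ) in nats, via the standard closed form:
    0.5 * [ tr(S1^-1 S0) + (m1-m0)^T S1^-1 (m1-m0) - k + ln(det S1 / det S0) ]."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    term_trace = np.trace(cov1_inv @ cov0)          # tr(S1^-1 S0)
    term_quad = diff @ cov1_inv @ diff              # Mahalanobis term
    _, logdet0 = np.linalg.slogdet(cov0)            # stable log-determinants
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (term_trace + term_quad - k + logdet1 - logdet0)

# Toy usage: compare the (hypothetical) Gaussian statistics of two successive layers.
rng = np.random.default_rng(0)
k = 4
mu_a, mu_b = np.zeros(k), np.zeros(k)
A = rng.normal(size=(k, k))
cov_a = A @ A.T + k * np.eye(k)          # layer-l covariance (toy)
cov_b = 0.8 * cov_a + 0.2 * np.eye(k)    # layer-(l+1) covariance (toy)
print(kl_gaussian(mu_a, cov_a, mu_b, cov_b))
```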
| Main Author: | Johanna Erdmenger, Kevin T. Grosvenor, Ro Jefferson |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | SciPost, 2022-01-01 |
| Series: | SciPost Physics |
| Online Access: | https://scipost.org/SciPostPhys.12.1.041 |
Similar Items

- Information geometry in quantum field theory: lessons from simple examples
  by: Johanna Erdmenger, Kevin T. Grosvenor, Ro Jefferson
  Published: (2020-05-01)
- Rényi relative entropies and renormalization group flows
  by: Horacio Casini, et al.
  Published: (2018-09-01)
- The edge of chaos: quantum field theory and deep neural networks
  by: Kevin T. Grosvenor, Ro Jefferson
  Published: (2022-03-01)
- Probing renormalization group flows using entanglement entropy
  by: Hong Liu, et al.
  Published: (2014)
- Combining the in-medium similarity renormalization group with the density matrix renormalization group: Shell structure and information entropy
  by: A. Tichai, et al.
  Published: (2023-10-01)