Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group
We investigate the analogy between the renormalization group (RG) and deep neural networks, wherein subsequent layers of neurons are analogous to successive steps along the RG. In particular, we quantify the flow of information by explicitly computing the relative entropy or Kullback-Leibler divergence…
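For orientation, the relative entropy between distributions $p$ and $q$ is the standard quantity $D_{\mathrm{KL}}(p\,\|\,q) = \int \mathrm{d}x\, p(x)\ln\!\left[p(x)/q(x)\right]$. The following is a minimal, illustrative Python sketch, not the paper's actual computation: it evaluates the closed-form KL divergence between two univariate Gaussians and cross-checks it against direct numerical integration of the defining formula. The function names (`kl_gaussian_exact`, `kl_gaussian_numeric`) and parameter choices are hypothetical.

```python
import numpy as np

def kl_gaussian_exact(mu1, sigma1, mu2, sigma2):
    """Closed-form D_KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) )."""
    return (np.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2.0 * sigma2**2)
            - 0.5)

def kl_gaussian_numeric(mu1, sigma1, mu2, sigma2, n=400_000, half_width=10.0):
    """Riemann-sum estimate of the defining integral of D_KL(p || q)."""
    x = np.linspace(mu1 - half_width * sigma1, mu1 + half_width * sigma1, n)
    dx = x[1] - x[0]
    p = np.exp(-0.5 * ((x - mu1) / sigma1) ** 2) / (sigma1 * np.sqrt(2.0 * np.pi))
    q = np.exp(-0.5 * ((x - mu2) / sigma2) ** 2) / (sigma2 * np.sqrt(2.0 * np.pi))
    return np.sum(p * np.log(p / q)) * dx

if __name__ == "__main__":
    # D_KL is asymmetric and non-negative; the two estimates should agree closely.
    print(kl_gaussian_exact(0.0, 1.0, 0.5, 2.0))    # ~0.34939
    print(kl_gaussian_numeric(0.0, 1.0, 0.5, 2.0))  # ~0.34939
```

In the paper itself, the analogous quantity is tracked along successive RG steps and across network layers; this toy check only illustrates the definition being computed.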
Main Author: | |
---|---|
Format: | Article |
Language: | English |
Published: | SciPost, 2022-01-01 |
Series: | SciPost Physics |
Online Access: | https://scipost.org/SciPostPhys.12.1.041 |