Markov Information Bottleneck to Improve Information Flow in Stochastic Neural Networks
While rate–distortion theory compresses data under a distortion constraint, the information bottleneck (IB) generalizes rate–distortion theory to learning problems by replacing the distortion constraint with a constraint on relevant information. In this work, we further extend IB to multiple Markov bottle...
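For context, the trade-off the abstract describes is usually written as the standard IB Lagrangian from the IB literature (this formula is not part of the record itself): a stochastic encoding $T$ of the input $X$ is compressed while retaining information about the relevant variable $Y$,

```latex
% Standard information bottleneck objective (Tishby et al.):
% minimize compression I(X;T) while preserving relevance I(T;Y),
% with beta >= 0 controlling the trade-off.
\min_{p(t \mid x)} \; I(X;T) - \beta \, I(T;Y)
```

Here $I(\cdot;\cdot)$ denotes mutual information; rate–distortion is recovered when relevance $I(T;Y)$ is replaced by an expected-distortion constraint.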
| Main Authors: | Thanh Tang Nguyen, Jaesik Choi |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2019-10-01 |
| Series: | Entropy |
| Online Access: | https://www.mdpi.com/1099-4300/21/10/976 |
Similar Items
- On the Difference between the Information Bottleneck and the Deep Information Bottleneck
  by: Aleksander Wieczorek, et al.
  Published: (2020-01-01)
- Novel Convolutional Neural Network with Variational Information Bottleneck for P300 Detection
  by: Hongpeng Liao, et al.
  Published: (2020-12-01)
- Splitting of Composite Neural Networks via Proximal Operator With Information Bottleneck
  by: Sang-Il Han, et al.
  Published: (2024-01-01)
- Information Bottleneck Signal Processing and Learning to Maximize Relevant Information for Communication Receivers
  by: Jan Lewandowsky, et al.
  Published: (2022-07-01)
- Information Bottleneck: Theory and Applications in Deep Learning
  by: Bernhard C. Geiger, et al.
  Published: (2020-12-01)