Markov Information Bottleneck to Improve Information Flow in Stochastic Neural Networks

While rate distortion theory compresses data under a distortion constraint, the information bottleneck (IB) generalizes rate distortion theory to learning problems by replacing the distortion constraint with a constraint on relevant information. In this work, we further extend IB to multiple Markov bottlenecks...
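For reference, and not taken from the record itself, the standard IB objective the abstract alludes to trades off compression of an input X into a representation T against retention of information about a relevance variable Y, with a Lagrange multiplier beta controlling the trade-off; the paper's Markov extension, as the title and abstract suggest, applies such bottleneck constraints across the layers of a stochastic neural network. A minimal sketch of the standard objective:

\min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y)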


Bibliographic Details
Main Authors: Thanh Tang Nguyen, Jaesik Choi
Format: Article
Language: English
Published: MDPI AG 2019-10-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/21/10/976