Studying the Evolution of Neural Activation Patterns During Training of Feed-Forward ReLU Networks

The ability of deep neural networks to form powerful emergent representations of complex statistical patterns in data is as remarkable as it is imperfectly understood. For deep ReLU networks, these representations are encoded in the mixed discrete–continuous structure of linear weight matrices and non-linear binary activation patterns...
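As an illustrative sketch (not taken from the article itself), the "binary activation pattern" of a single feed-forward ReLU layer can be made concrete in a few lines of NumPy: the weight matrix and bias give the continuous part, and the indicator of which units have positive pre-activation gives the discrete part. All names and shapes below are hypothetical.

```python
import numpy as np

def relu_layer(x, W, b):
    """Affine map followed by ReLU.

    Returns the layer output and the binary activation pattern
    (which units are active, i.e. have pre-activation > 0).
    """
    pre = W @ x + b              # continuous part: linear weights
    pattern = pre > 0            # discrete part: binary activation pattern
    return np.where(pattern, pre, 0.0), pattern

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))      # hypothetical layer mapping 3 -> 4 units
b = rng.normal(size=4)
x = rng.normal(size=3)

out, pattern = relu_layer(x, W, b)
print("activation pattern:", pattern.astype(int))
```

Tracking how such patterns change across training steps is, roughly, the kind of quantity the article studies; the snippet is only meant to fix the terminology.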


Bibliographic Details
Main Authors: David Hartmann, Daniel Franzen, Sebastian Brodehl
Format: Article
Language: English
Published: Frontiers Media S.A. 2021-12-01
Series: Frontiers in Artificial Intelligence
Online Access: https://www.frontiersin.org/articles/10.3389/frai.2021.642374/full