Universal characteristics of deep neural network loss surfaces from random matrix theory

Bibliographic Details
Main Authors: Baskerville, N, Keating, JP, Mezzadri, F, Najnudel, J, Granziol, D
Format: Journal article
Language: English
Published: IOP Publishing 2022
Description
Summary: This paper considers several aspects of random matrix universality in deep neural networks (DNNs). Motivated by recent experimental work, we use universal properties of random matrices related to local statistics to derive practical implications for DNNs based on a realistic model of their Hessians. In particular, we derive universal aspects of outliers in the spectra of deep neural networks and demonstrate the important role of random matrix local laws in popular pre-conditioning gradient descent algorithms. We also present insights into DNN loss surfaces from quite general arguments based on tools from statistical physics and random matrix theory.
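
The "bulk plus outliers" Hessian spectra referred to in the summary can be illustrated, under assumptions not taken from the paper, with a standard spiked Wigner model: the bulk eigenvalues follow the semicircle law while a low-rank perturbation of strength theta > 1 produces isolated outliers near theta + 1/theta. The following minimal sketch is only an analogy for the kind of spectra discussed, not the authors' Hessian model; the dimension and spike strength are arbitrary choices.

```python
# Minimal sketch (assumption: spiked GOE as a stand-in for a DNN Hessian spectrum).
import numpy as np

rng = np.random.default_rng(0)
n = 1000        # matrix dimension (illustrative choice)
theta = 3.0     # spike strength; outliers appear for theta > 1 (BBP transition)

# GOE-normalised Wigner matrix: symmetric, off-diagonal entries with variance 1/n.
A = rng.normal(size=(n, n)) / np.sqrt(n)
W = (A + A.T) / np.sqrt(2)

# Rank-one "spike" mimicking an isolated outlier direction in the spectrum.
v = rng.normal(size=n)
v /= np.linalg.norm(v)
H = W + theta * np.outer(v, v)

eigs = np.linalg.eigvalsh(H)                   # ascending order
print("largest eigenvalue:", eigs[-1])         # outlier, approx. theta + 1/theta ~ 3.33
print("next eigenvalue:", eigs[-2])            # near the semicircle bulk edge at 2
```

Running the sketch shows one eigenvalue detached from the bulk, which is the qualitative picture behind the paper's discussion of Hessian outliers and their interaction with pre-conditioned gradient descent.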