A theoretically grounded application of dropout in recurrent neural networks

Recurrent neural networks (RNNs) stand at the forefront of many recent developments in deep learning. Yet a major difficulty with these models is their tendency to overfit, with dropout shown to fail when applied to recurrent layers. Recent results at the intersection of Bayesian modelling and deep learning offer a Bayesian interpretation of common deep learning techniques such as dropout ...
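The technique the abstract alludes to ties a single dropout mask to the entire sequence, applied to both the inputs and the recurrent connections at every timestep, rather than resampling a fresh mask per step as naive dropout does. Below is a minimal PyTorch sketch of that idea, not the authors' code; the class name VariationalLSTMCell and the dropout rate are illustrative assumptions.

# Minimal sketch of variational RNN dropout: sample ONE dropout mask
# per sequence and reuse it at every timestep, for both the input and
# the recurrent hidden state, instead of resampling each step.
import torch
import torch.nn as nn

class VariationalLSTMCell(nn.Module):
    """LSTM cell wrapper applying timestep-tied dropout masks (hypothetical helper)."""

    def __init__(self, input_size, hidden_size, dropout=0.25):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.dropout = dropout

    def forward(self, x):
        # x: (seq_len, batch, input_size)
        seq_len, batch, input_size = x.shape
        h = x.new_zeros(batch, self.cell.hidden_size)
        c = x.new_zeros(batch, self.cell.hidden_size)

        if self.training and self.dropout > 0:
            keep = 1.0 - self.dropout
            # One Bernoulli mask per sequence, shared across all timesteps.
            mask_x = x.new_empty(batch, input_size).bernoulli_(keep) / keep
            mask_h = x.new_empty(batch, self.cell.hidden_size).bernoulli_(keep) / keep
        else:
            mask_x = mask_h = None

        outputs = []
        for t in range(seq_len):
            x_t = x[t] * mask_x if mask_x is not None else x[t]
            h_in = h * mask_h if mask_h is not None else h
            h, c = self.cell(x_t, (h_in, c))
            outputs.append(h)
        return torch.stack(outputs), (h, c)

At evaluation time the wrapper falls back to the deterministic cell; reusing the same mask across timesteps is what distinguishes this scheme from per-step dropout, which the abstract notes fails on recurrent layers.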


Bibliographic details
Main authors: Gal, Y.; Ghahramani, Z.
Format: Conference item
Language: English
Published: Massachusetts Institute of Technology Press, 2016