Gated Orthogonal Recurrent Units: On Learning to Forget

Bibliographic Details
Main Authors: Jing, Li, Gulcehre, Caglar, Peurifoy, John, Shen, Yichen, Tegmark, Max, Soljacic, Marin, Bengio, Yoshua
Other Authors: Sloan School of Management
Format: Article
Language:English
Published: MIT Press - Journals 2021
Online Access:https://hdl.handle.net/1721.1/135148
Description
Summary: © 2019 Massachusetts Institute of Technology. We present a novel recurrent neural network (RNN)-based model that combines the remembering ability of unitary evolution RNNs with the ability of gated RNNs to effectively forget redundant or irrelevant information in memory. We achieve this by extending restricted orthogonal evolution RNNs with a gating mechanism similar to that of gated recurrent unit RNNs, with a reset gate and an update gate. Our model outperforms long short-term memory, gated recurrent units, and vanilla unitary or orthogonal RNNs on several long-term-dependency benchmark tasks. We empirically show that both orthogonal and unitary RNNs lack the ability to forget, an ability that plays an important role in RNNs. We provide competitive results, along with an analysis of our model, on many natural sequential tasks, including question answering, speech spectrum prediction, and character-level language modeling, as well as synthetic tasks that involve long-term dependencies, such as algorithmic, denoising, and copying tasks.
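
The mechanism described in the abstract, GRU-style reset and update gates wrapped around an orthogonal recurrent transform, can be sketched as follows. This is a simplified illustration, not the paper's implementation: the weight names (`Wr`, `Uo`, etc.) are assumptions, the orthogonal matrix is built here via a QR decomposition rather than the paper's parameterization, and `tanh` stands in for the activation used in the actual model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def random_orthogonal(n, rng):
    # The Q factor of a random Gaussian matrix is orthogonal (Q @ Q.T == I).
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

def goru_step(x, h, p):
    """One GORU-style step: GRU gating around an orthogonal recurrent transform.

    Hypothetical parameter dict `p`; names are illustrative only.
    """
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h + p["br"])  # reset gate
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h + p["bz"])  # update gate
    # Candidate state: the recurrent matrix Uo is orthogonal, preserving the
    # norm of the (reset-gated) hidden state and thus long-term memory.
    h_tilde = np.tanh(p["Wh"] @ x + p["Uo"] @ (r * h) + p["bh"])
    # Update gate interpolates between keeping the old state and forgetting it.
    return z * h + (1.0 - z) * h_tilde

rng = np.random.default_rng(0)
n_in, n_h = 3, 4
p = {
    "Wr": 0.1 * rng.standard_normal((n_h, n_in)),
    "Ur": 0.1 * rng.standard_normal((n_h, n_h)),
    "br": np.zeros(n_h),
    "Wz": 0.1 * rng.standard_normal((n_h, n_in)),
    "Uz": 0.1 * rng.standard_normal((n_h, n_h)),
    "bz": np.zeros(n_h),
    "Wh": 0.1 * rng.standard_normal((n_h, n_in)),
    "Uo": random_orthogonal(n_h, rng),
    "bh": np.zeros(n_h),
}
h = np.zeros(n_h)
for _ in range(5):
    h = goru_step(rng.standard_normal(n_in), h, p)
```

The gates are what a plain unitary or orthogonal RNN lacks: with `z` close to 0 the candidate overwrites the state (forgetting), while with `z` close to 1 the orthogonal transform carries the old state forward without loss.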