Rotational Unit of Memory: A Novel Representation Unit for RNNs with Scalable Applications

Stacking long short-term memory (LSTM) cells or gated recurrent units (GRUs) as part of a recurrent neural network (RNN) has become a standard approach to solving a number of tasks ranging from language modeling to text summarization. Although LSTMs and GRUs were designed to model long-range dependencies...
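For readers unfamiliar with the baseline the abstract refers to, below is a minimal sketch of a stacked-LSTM language model. This is not code from the article; it uses PyTorch as an assumed framework, and all names and layer sizes (StackedLSTMLM, vocab_size, embed_dim, hidden_dim, num_layers) are illustrative choices, not values from the paper.

    # Minimal sketch of the "stacked LSTM" setup the abstract mentions.
    # PyTorch and all hyperparameters here are assumptions for illustration.
    import torch
    import torch.nn as nn

    class StackedLSTMLM(nn.Module):
        """Toy language model: embedding -> stacked LSTM layers -> vocab logits."""
        def __init__(self, vocab_size=10000, embed_dim=256, hidden_dim=512, num_layers=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            # num_layers > 1 stacks LSTM cells on top of one another
            self.lstm = nn.LSTM(embed_dim, hidden_dim,
                                num_layers=num_layers, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, tokens):
            hidden, _ = self.lstm(self.embed(tokens))
            return self.out(hidden)  # per-time-step logits over the vocabulary

    # Usage: a batch of 4 sequences, 32 time steps each
    model = StackedLSTMLM()
    logits = model(torch.randint(0, 10000, (4, 32)))
    print(logits.shape)  # torch.Size([4, 32, 10000])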


Bibliographic Details
Main Authors: Dangovski, Rumen, Jing, Li, Nakov, Preslav, Tatalović, Mićo, Soljačić, Marin
Format: Article
Language: English
Published: The MIT Press 2019-11-01
Series: Transactions of the Association for Computational Linguistics
Online Access: https://www.mitpressjournals.org/doi/abs/10.1162/tacl_a_00258