Rotational Unit of Memory: A Novel Representation Unit for RNNs with Scalable Applications

Stacking long short-term memory (LSTM) cells or gated recurrent units (GRUs) as part of a recurrent neural network (RNN) has become a standard approach to solving a number of tasks ranging from language modeling to text summarization. Although LSTMs and GRUs were designed to model long-range dependencies …
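As an aside to the abstract's opening claim, a minimal sketch of the standard stacked-LSTM recipe it refers to, assuming PyTorch; the class name and all dimensions here are illustrative placeholders, not taken from the paper:

```python
import torch
import torch.nn as nn

class StackedLSTMLanguageModel(nn.Module):
    """Illustrative stacked-LSTM language model (hypothetical example)."""

    def __init__(self, vocab_size=10000, embed_dim=256, hidden_dim=512, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # num_layers > 1 stacks LSTM cells, each layer feeding the next
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            num_layers=num_layers, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        x = self.embed(tokens)   # (batch, seq_len, embed_dim)
        out, _ = self.lstm(x)    # hidden states from the top layer
        return self.proj(out)    # per-token vocabulary logits

model = StackedLSTMLanguageModel()
logits = model(torch.randint(0, 10000, (4, 32)))  # 4 sequences of length 32
print(logits.shape)  # torch.Size([4, 32, 10000])
```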

Bibliographic Details
Main Authors: Dangovski, Rumen, Jing, Li, Nakov, Preslav, Tatalović, Mićo, Soljačić, Marin
Format: Article
Language: English
Published: MIT Press - Journals 2021
Online Access: https://hdl.handle.net/1721.1/132374