Rotational Unit of Memory: A Novel Representation Unit for RNNs with Scalable Applications

Stacking long short-term memory (LSTM) cells or gated recurrent units (GRUs) as part of a recurrent neural network (RNN) has become a standard approach to solving a number of tasks ranging from language modeling to text summarization. Although LSTMs and GRUs were designed to model lon...
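The stacking approach the abstract refers to can be illustrated with a minimal sketch: each recurrent layer updates its own hidden state from the layer below, and that state becomes the input to the next layer. The sketch below uses a plain NumPy GRU cell (the standard Cho et al. formulation); it is an illustration of stacked recurrent layers in general, not of the rotational unit of memory proposed in this article, and all names (`GRUCell`, `run_stack`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell, for illustration only (not the article's RUM unit)."""
    def __init__(self, input_size, hidden_size):
        self.hidden_size = hidden_size
        # One (W, U, b) triple per gate: update (z), reset (r), candidate (h~)
        self.W = rng.normal(0.0, 0.1, (3, hidden_size, input_size))
        self.U = rng.normal(0.0, 0.1, (3, hidden_size, hidden_size))
        self.b = np.zeros((3, hidden_size))

    def step(self, x, h):
        z = sigmoid(self.W[0] @ x + self.U[0] @ h + self.b[0])        # update gate
        r = sigmoid(self.W[1] @ x + self.U[1] @ h + self.b[1])        # reset gate
        h_tilde = np.tanh(self.W[2] @ x + self.U[2] @ (r * h) + self.b[2])
        return (1.0 - z) * h + z * h_tilde                            # new hidden state

def run_stack(cells, xs):
    """Run a stack of recurrent cells over a sequence; layer i feeds layer i+1."""
    states = [np.zeros(c.hidden_size) for c in cells]
    for x in xs:
        for i, cell in enumerate(cells):
            states[i] = cell.step(x, states[i])
            x = states[i]  # output of this layer is the next layer's input
    return states[-1]      # final hidden state of the top layer

cells = [GRUCell(8, 16), GRUCell(16, 16)]       # two stacked layers
xs = [rng.normal(size=8) for _ in range(5)]     # a length-5 input sequence
out = run_stack(cells, xs)
print(out.shape)  # (16,)
```

Because each new hidden state is a convex combination of the previous state and a `tanh` candidate, every entry of the output stays in (-1, 1).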


Bibliographic Details
Main Authors: Dangovski, Rumen, Jing, Li, Nakov, Preslav, Tatalovic, Mico, Soljacic, Marin
Other Authors: Massachusetts Institute of Technology. Research Laboratory of Electronics
Format: Article
Language: English
Published: MIT Press - Journals 2022
Online Access: https://hdl.handle.net/1721.1/132374.2