Molding CNNs for text: Non-linear, non-consecutive convolutions

The success of deep learning often derives from well-chosen operational building blocks. In this work, we revise the temporal convolution operation in CNNs to better adapt it to text processing. Instead of concatenating word representations, we appeal to tensor algebra and u...

Bibliographic Details
Main Authors: Lei, Tao, Barzilay, Regina, Jaakkola, Tommi S.
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: en_US
Published: Association for Computational Linguistics 2017
Online Access: http://hdl.handle.net/1721.1/110753
https://orcid.org/0000-0003-4644-3088
https://orcid.org/0000-0002-2921-8201
https://orcid.org/0000-0002-2199-0379