Task-Optimized Word Embeddings for Text Classification Representations

Word embeddings have introduced a compact and efficient way of representing text for downstream natural language processing (NLP) tasks. Most word embedding algorithms are optimized at the word level. However, many NLP applications require text representations for groups of words, such as sentences…
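The common baseline for turning word-level embeddings into a sentence-level representation is mean pooling (averaging the word vectors). The following is an illustrative sketch with made-up toy vectors, not the task-optimized method proposed in this article:

```python
import numpy as np

# Hypothetical 4-dimensional word embeddings for illustration only.
embeddings = {
    "word": np.array([0.1, 0.2, 0.3, 0.4]),
    "embeddings": np.array([0.5, 0.1, 0.0, 0.2]),
    "help": np.array([0.2, 0.2, 0.2, 0.2]),
}

def sentence_embedding(tokens, embeddings):
    """Mean-pool the word vectors of a sentence (a common baseline)."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    if not vecs:
        # No known tokens: fall back to a zero vector of the right size.
        return np.zeros_like(next(iter(embeddings.values())))
    return np.mean(vecs, axis=0)

sent = sentence_embedding(["word", "embeddings"], embeddings)
print(sent)  # element-wise mean of the two word vectors
```

Because averaging is not optimized for any particular downstream task, task-specific approaches like the one described here aim to learn weights that produce better classification representations.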


Bibliographic Details
Main Authors: Sukrat Gupta, Teja Kanchinadam, Devin Conathan, Glenn Fung
Format: Article
Language: English
Published: Frontiers Media S.A., 2020-01-01
Series: Frontiers in Applied Mathematics and Statistics
Online Access: https://www.frontiersin.org/article/10.3389/fams.2019.00067/full