Better conditioning on context for natural language processing
Learning distributed representations of natural language has become a common practice for Natural Language Processing (NLP). Non-contextual embeddings map each token in the vocabulary to a low-dimensional real-valued vector. Although these representations perform competitively on word-level...
Main Author: |
---|---
Other Authors: |
Format: | Thesis
Language: | English
Published: | 2022
Subjects: |
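
The abstract above defines non-contextual embeddings as a fixed mapping from each vocabulary token to a low-dimensional real-valued vector. As a minimal sketch (not taken from the thesis; the toy vocabulary, dimension, and helper `embed` are assumptions for illustration), this is what such a context-independent lookup looks like:

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = {"the": 0, "bank": 1, "river": 2, "money": 3}  # toy vocabulary (assumed)
dim = 8                                                 # embedding dimension (assumed)
embeddings = rng.normal(size=(len(vocab), dim))         # one fixed vector per token

def embed(tokens):
    """Look up the same vector for a token wherever it occurs."""
    return np.stack([embeddings[vocab[t]] for t in tokens])

# "bank" receives an identical vector in both phrases, regardless of context;
# conditioning on context, as the thesis title suggests, addresses this limitation.
a = embed(["the", "river", "bank"])
b = embed(["the", "money", "bank"])
assert np.allclose(a[2], b[2])
```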