Better conditioning on context for natural language processing
Learning distributed representations of natural language has become a common practice for Natural Language Processing (NLP). Non-contextual embeddings map each token in the vocabulary to a low-dimensional real-valued vector. Although these representations perform competitively on word-level...
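The abstract's description of non-contextual embeddings can be illustrated with a toy lookup table: each vocabulary token maps to one fixed low-dimensional vector, independent of the surrounding sentence. The vocabulary, dimensionality, and values below are made up for illustration and are not taken from the thesis.

```python
import numpy as np

# Toy non-contextual (static) embedding table: one fixed vector per
# vocabulary token, regardless of context. Illustrative values only.
vocab = {"the": 0, "cat": 1, "sat": 2, "<unk>": 3}   # hypothetical toy vocabulary
embedding_dim = 4                                    # real models typically use 100-1024 dims
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim)).astype(np.float32)

def embed(tokens):
    """Look up one fixed vector per token; unknown tokens fall back to <unk>."""
    ids = [vocab.get(t, vocab["<unk>"]) for t in tokens]
    return embeddings[ids]           # shape: (len(tokens), embedding_dim)

print(embed(["the", "cat", "sat"]).shape)  # (3, 4)
```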
| Main Author | |
| --- | --- |
| Other Authors | |
| Format | Thesis |
| Language | English |
| Published | 2022 |
| Subjects | |