Better conditioning on context for natural language processing
Learning distributed representations of natural language has become common practice in Natural Language Processing (NLP). Non-contextual embeddings map each token in the vocabulary to a low-dimensional real-valued vector. Although these representations perform competitively on word-level...
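The abstract's distinction hinges on non-contextual embeddings assigning one fixed vector per vocabulary token. A minimal sketch of such a lookup, using a hypothetical toy vocabulary and a random embedding matrix (all names and dimensions here are illustrative assumptions, not from the thesis):

```python
import numpy as np

# Toy vocabulary: each token id indexes one row of the embedding matrix.
vocab = {"bank": 0, "river": 1, "money": 2}
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((len(vocab), 4))  # 4-dim vectors for illustration

def embed(token: str) -> np.ndarray:
    # Non-contextual lookup: the same vector is returned for a token
    # regardless of the sentence it appears in.
    return embeddings[vocab[token]]

# "bank" receives an identical vector in both of these contexts,
# which is precisely the limitation contextual models address.
v_river = embed("bank")  # as in "river bank"
v_money = embed("bank")  # as in "bank account"
assert np.array_equal(v_river, v_money)
```

This context-insensitivity is what motivates conditioning representations on surrounding text, as the title suggests.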
| Primary author: | |
|---|---|
| Other authors: | |
| Format: | Thesis |
| Language: | English |
| Published: | 2022 |
| Subjects: | |