Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing
We introduce a novel method for multilingual transfer that utilizes deep contextual embeddings, pretrained in an unsupervised fashion. While contextual embeddings have been shown to yield richer representations of meaning compared to their static counterparts, aligning them poses a challenge due to...
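The abstract describes aligning contextual embedding spaces across languages. A common baseline for this kind of alignment (not necessarily the exact procedure in this paper) is orthogonal Procrustes: given paired anchor embeddings from the source and target spaces, solve for the rotation that best maps one onto the other. A minimal sketch, with synthetic data standing in for real anchor embeddings:

```python
import numpy as np

def procrustes_align(src, tgt):
    """Learn an orthogonal matrix W minimizing ||src @ W - tgt||_F.

    src, tgt: (n, d) arrays of paired anchor embeddings.
    Solution: W = U V^T, where U S V^T is the SVD of src^T tgt.
    """
    u, _, vt = np.linalg.svd(src.T @ tgt)
    return u @ vt

# Synthetic check: embed 200 "anchors" in 50-d, rotate them by a
# known orthogonal matrix, and verify the rotation is recovered.
rng = np.random.default_rng(0)
q, _ = np.linalg.qr(rng.standard_normal((50, 50)))  # ground-truth rotation
x = rng.standard_normal((200, 50))                  # source anchors
y = x @ q                                           # target anchors
w = procrustes_align(x, y)
print(np.allclose(w, q, atol=1e-6))
```

In practice the anchors would be averaged contextual embeddings of translation pairs, and the learned map would be applied to all source-language embeddings before parsing.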
Main Authors: Schuster, Tal; Ram, Ori; Barzilay, Regina; Globerson, Amir
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: English
Published: Association for Computational Linguistics, 2020
Online Access: https://hdl.handle.net/1721.1/128715
Similar Items
- Selective Sharing for Multilingual Dependency Parsing
  by: Naseem, Tahira, et al.
  Published: (2014)
- Implicit Cross-Lingual Word Embedding Alignment for Reference-Free Machine Translation Evaluation
  by: Min Zhang, et al.
  Published: (2023-01-01)
- English-Malay Cross-Lingual Emotion Detection In Tweets Using Word Embedding Alignment
  by: Lim, Ying Hao
  Published: (2023)
- Constituency Parsing by Cross-Lingual Delexicalization
  by: Hour Kaing, et al.
  Published: (2021-01-01)
- Zero-Shot Learning for Cross-Lingual News Sentiment Classification
  by: Andraž Pelicon, et al.
  Published: (2020-08-01)