Deep Contextualized Self-training for Low Resource Dependency Parsing
Neural dependency parsing has proven very effective, achieving state-of-the-art results on numerous domains and languages. Unfortunately, it requires large amounts of labeled data, which is costly and laborious to create. In this paper we propose a self-training algorithm that alleviates this annota...
| Main Authors: | Rotman, Guy; Reichart, Roi |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | The MIT Press, 2019-11-01 |
| Series: | Transactions of the Association for Computational Linguistics |
| Online Access: | https://www.mitpressjournals.org/doi/abs/10.1162/tacl_a_00294 |
Similar Items
- Perturbation Based Learning for Structured NLP Tasks with Application to Dependency Parsing
  by: Doitch, Amichay, et al.
  Published: (2019-11-01)
- Model Compression for Domain Adaptation through Causal Effect Estimation
  by: Rotman, Guy, et al.
  Published: (2021-01-01)
- Scene Graph Parsing as Dependency Parsing
  by: Wang, Yu-Siang, et al.
  Published: (2018)
- Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing
  by: Schuster, Tal, et al.
  Published: (2020)
- Improving Low-resource Dependency Parsing Using Multi-strategy Data Augmentation
  by: Xian, Yan-tuan; Gao, Fan-ya; Xiang, Yan; Yu, Zheng-tao; Wang, Jian
  Published: (2022-01-01)