Deep Contextualized Self-training for Low Resource Dependency Parsing
Neural dependency parsing has proven very effective, achieving state-of-the-art results across numerous domains and languages. Unfortunately, it requires large amounts of labeled data, which is costly and laborious to create. In this paper we propose a self-training algorithm that alleviates this annotation...
| Main Authors: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | The MIT Press, 2019-11-01 |
| Series: | Transactions of the Association for Computational Linguistics |
| Online Access: | https://www.mitpressjournals.org/doi/abs/10.1162/tacl_a_00294 |