Fine-Tuning BERT-Based Pre-Trained Models for Arabic Dependency Parsing
With the advent of pre-trained language models, many natural language processing tasks in various languages have achieved great success. Although some research has been conducted on fine-tuning BERT-based models for syntactic parsing, and several Arabic pre-trained models have been developed, no att...
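For illustration, below is a minimal sketch of the kind of setup the title describes: a pre-trained BERT-style encoder fine-tuned with a biaffine arc scorer on top, in the style of graph-based dependency parsers. The model name `aubmindlab/bert-base-arabertv2`, the biaffine head, and all dimensions are assumptions made for the sketch; the record does not state which Arabic pre-trained models or parsing architecture the authors actually used.

```python
# A hedged sketch, NOT the paper's implementation: fine-tuning a BERT-based
# Arabic encoder for dependency parsing with a biaffine arc scorer.
# The model id "aubmindlab/bert-base-arabertv2" is an assumed example.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class BiaffineArcScorer(nn.Module):
    """Scores every (dependent, head) token pair with a biaffine product."""

    def __init__(self, hidden_size: int, arc_dim: int = 256):
        super().__init__()
        self.head_mlp = nn.Sequential(nn.Linear(hidden_size, arc_dim), nn.ReLU())
        self.dep_mlp = nn.Sequential(nn.Linear(hidden_size, arc_dim), nn.ReLU())
        self.W = nn.Parameter(torch.empty(arc_dim, arc_dim))
        nn.init.xavier_uniform_(self.W)
        self.head_bias = nn.Linear(arc_dim, 1, bias=False)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq, hidden_size)
        heads = self.head_mlp(hidden)                     # (batch, seq, arc_dim)
        deps = self.dep_mlp(hidden)                       # (batch, seq, arc_dim)
        # scores[b, i, j] = deps[b, i] . W . heads[b, j] + bias(heads[b, j])
        scores = deps @ self.W @ heads.transpose(1, 2)    # (batch, seq, seq)
        scores = scores + self.head_bias(heads).transpose(1, 2)
        return scores


class BertForDependencyParsing(nn.Module):
    def __init__(self, model_name: str = "aubmindlab/bert-base-arabertv2"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        self.arc_scorer = BiaffineArcScorer(self.encoder.config.hidden_size)

    def forward(self, input_ids, attention_mask, head_indices=None):
        hidden = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        arc_scores = self.arc_scorer(hidden)              # (batch, seq, seq)
        if head_indices is not None:
            # Training: each token's gold head index is a class label;
            # pad/special positions are masked with -100.
            loss = nn.functional.cross_entropy(
                arc_scores.view(-1, arc_scores.size(-1)),
                head_indices.view(-1),
                ignore_index=-100,
            )
            return loss, arc_scores
        return arc_scores


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv2")
    model = BertForDependencyParsing()
    batch = tokenizer(["قرأ الولد الكتاب"], return_tensors="pt")
    scores = model(batch["input_ids"], batch["attention_mask"])
    # Greedy per-token head prediction, for demonstration only.
    pred_heads = scores.argmax(dim=-1)
```

A full parser would additionally score dependency labels and decode with a maximum-spanning-tree algorithm (e.g., Chu-Liu/Edmonds) so the output is guaranteed to be a well-formed tree; the greedy argmax above does not enforce that. Gold head indices must also be aligned from words to subword tokens (typically scored at each word's first subword).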
| Main Authors: | Sharefah Al-Ghamdi, Hend Al-Khalifa, Abdulmalik Al-Salman |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2023-03-01 |
| Series: | Applied Sciences |
| Online Access: | https://www.mdpi.com/2076-3417/13/7/4225 |
Similar Items

- A Drop-in Replacement for LR(1) Table-Driven Parsing
  by: Michael Oudshoorn
  Published: (2021-12-01)
- Hierarchical Clause Annotation: Building a Clause-Level Corpus for Semantic Parsing with Complex Sentences
  by: Yunlong Fan, et al.
  Published: (2023-08-01)
- Tunnel Parsing with Ambiguous Grammars
  by: Handzhiyski Nikolay, et al.
  Published: (2023-06-01)
- Natural language parsing systems /
  by: Bolc, Leonard
  Published: (1987)
- Parsing techniques : a practical guide /
  by: Grune, Dick, 1939-, et al.
  Published: (1990)