Fine-Tuning BERT-Based Pre-Trained Models for Arabic Dependency Parsing

With the advent of pre-trained language models, many natural language processing tasks in various languages have achieved great success. Although some research has been conducted on fine-tuning BERT-based models for syntactic parsing, and several Arabic pre-trained models have been developed, no att...
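The article concerns fine-tuning BERT-based encoders for Arabic dependency parsing. A common way to build such a parser (following the widely used deep biaffine approach of Dozat and Manning, not necessarily the exact architecture in this article) is to place a biaffine arc-scoring head on top of the encoder's contextual embeddings. The sketch below illustrates that scoring step only, with random NumPy arrays standing in for the output of an Arabic BERT model; all names and dimensions here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def biaffine_arc_scores(H, Wd, Wh, U, b):
    """Score every (dependent i, head j) pair for one sentence.

    H: (seq_len, hidden) contextual embeddings -- a stand-in for the
       last-layer output of a fine-tuned BERT-style encoder.
    Wd, Wh: MLP projections for a token acting as dependent vs. as head.
    U, b: biaffine weight matrix and head-bias vector.
    """
    dep = relu(H @ Wd)    # dependent representations, (seq_len, d)
    head = relu(H @ Wh)   # head representations, (seq_len, d)
    # scores[i, j] = dep_i^T U head_j + b . head_j
    return dep @ U @ head.T + head @ b

hidden, d, seq_len = 768, 128, 5   # illustrative sizes (768 = BERT-base)
H = rng.standard_normal((seq_len, hidden))
Wd = rng.standard_normal((hidden, d)) * 0.01
Wh = rng.standard_normal((hidden, d)) * 0.01
U = rng.standard_normal((d, d)) * 0.01
b = rng.standard_normal(d) * 0.01

scores = biaffine_arc_scores(H, Wd, Wh, U, b)
heads = scores.argmax(axis=1)  # greedy head choice for each token
print(scores.shape)
```

In a full parser, the score matrix would be fed to a cross-entropy loss against gold head indices during fine-tuning, and to a maximum-spanning-tree decoder at inference time to guarantee a well-formed dependency tree.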


Bibliographic Details
Main Authors: Sharefah Al-Ghamdi, Hend Al-Khalifa, Abdulmalik Al-Salman
Format: Article
Language: English
Published: MDPI AG, 2023-03-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/13/7/4225

Similar Items