A Lite Romanian BERT: ALR-BERT
Large-scale pre-trained language representations and their promising performance in various downstream applications have become an area of interest in natural language processing (NLP). There has been considerable interest in further increasing model size in order to outperform the best prev...
Main Authors: Dragoş Constantin Nicolae, Rohan Kumar Yadav, Dan Tufiş
Format: Article
Language: English
Published: MDPI AG (2022-04-01)
Series: Computers
Online Access: https://www.mdpi.com/2073-431X/11/4/57
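
The abstract above concerns ALR-BERT, a pretrained Romanian masked language model. As a purely illustrative sketch (not code from the article), the snippet below shows how such a checkpoint would typically be queried for masked-token prediction with the Hugging Face Transformers library; the model identifier is a hypothetical placeholder, since this record does not name a published checkpoint.

```python
# Illustrative sketch only: querying a BERT-style Romanian masked LM.
# "path/to/alr-bert" is a hypothetical placeholder, not an id from the article.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

MODEL_ID = "path/to/alr-bert"  # hypothetical checkpoint id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

# Romanian sentence with one masked token
# ("Bucharest is the capital of [MASK].").
text = f"București este capitala {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Pick the most probable token at the mask position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```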
Similar Items
- Pre-Training MLM Using Bert for the Albanian Language
  by: Kryeziu Labehat, et al.
  Published: (2023-06-01)
- Bangla-BERT: Transformer-Based Efficient Model for Transfer Learning and Language Understanding
  by: M. Kowsher, et al.
  Published: (2022-01-01)
- Text Mining of Stocktwits Data for Predicting Stock Prices
  by: Mukul Jaggi, et al.
  Published: (2021-02-01)
- Text Punctuation Prediction Based on the BERT Model
  by: C.В. Знахур, et al.
  Published: (2020-03-01)
- An Empirical Comparison of Portuguese and Multilingual BERT Models for Auto-Classification of NCM Codes in International Trade
  by: Roberta Rodrigues de Lima, et al.
  Published: (2022-01-01)