A Lite Romanian BERT: ALR-BERT
Large-scale pre-trained language representations and their promising performance in various downstream applications have become an area of interest in the field of natural language processing (NLP). There has been huge interest in further increasing models' size in order to outperform the best prev...
Main Authors: Dragoş Constantin Nicolae, Rohan Kumar Yadav, Dan Tufiş
Format: Article
Language: English
Published: MDPI AG, 2022-04-01
Series: Computers
Online Access: https://www.mdpi.com/2073-431X/11/4/57
Similar Items
- Pre-Training MLM Using Bert for the Albanian Language, by Kryeziu Labehat, et al. Published: (2023-06-01)
- Bangla-BERT: Transformer-Based Efficient Model for Transfer Learning and Language Understanding, by M. Kowsher, et al. Published: (2022-01-01)
- Extracting patient lifestyle characteristics from Dutch clinical text with BERT models, by Hielke Muizelaar, et al. Published: (2024-06-01)
- Text Mining of Stocktwits Data for Predicting Stock Prices, by Mukul Jaggi, et al. Published: (2021-02-01)
- SSMBERT: A Space Science Mission Requirement Classification Method Based on BERT, by Yiming Zhu, et al. Published: (2024-12-01)