A Lite Romanian BERT: ALR-BERT

Large-scale pre-trained language representations and their promising performance in various downstream applications have become an area of interest in the field of natural language processing (NLP). There has been considerable interest in further increasing model size in order to outperform the best prev...

Bibliographic Details
Main Authors: Dragoş Constantin Nicolae, Rohan Kumar Yadav, Dan Tufiş
Format: Article
Language: English
Published: MDPI AG 2022-04-01
Series: Computers
Online Access: https://www.mdpi.com/2073-431X/11/4/57