Bangla-BERT: Transformer-Based Efficient Model for Transfer Learning and Language Understanding
The advent of pre-trained language models has ushered in a new era of Natural Language Processing (NLP), enabling the creation of powerful language models. Among these, Transformer-based models such as BERT have grown in popularity due to their cutting-edge effectiveness. However, these models heavil...
Main Authors: M. Kowsher, Abdullah As Sami, Nusrat Jahan Prottasha, Mohammad Shamsul Arefin, Pranab Kumar Dhar, Takeshi Koshiba
Format: Article
Language: English
Published: IEEE, 2022-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9852438/
Similar Items
- Transfer Learning for Sentiment Analysis Using BERT Based Supervised Fine-Tuning
  by: Nusrat Jahan Prottasha, et al.
  Published: (2022-05-01)
- An Enhanced Neural Word Embedding Model for Transfer Learning
  by: Md. Kowsher, et al.
  Published: (2022-03-01)
- An ensemble novel architecture for Bangla Mathematical Entity Recognition (MER) using transformer based learning
  by: Tanjim Taharat Aurpa, et al.
  Published: (2024-02-01)
- Banner: A Cost-Sensitive Contextualized Model for Bangla Named Entity Recognition
  by: Imranul Ashrafi, et al.
  Published: (2020-01-01)
- Compilation, Analysis and Application of a Comprehensive Bangla Corpus KUMono
  by: Aysha Akther, et al.
  Published: (2022-01-01)