Language modeling and bidirectional encoder representations: an overview of key technologies
The article is an overview of the development of natural language processing technologies that formed the basis of BERT (Bidirectional Encoder Representations from Transformers), a language model from Google that shows high results on a whole class of problems associated with the understanding of...
| Main Author: | D. I. Kachkou |
|---|---|
| Format: | Article |
| Language: | Russian |
| Published: | The United Institute of Informatics Problems of the National Academy of Sciences of Belarus, 2021-01-01 |
| Series: | Informatika |
| Online Access: | https://inf.grid.by/jour/article/view/1080 |
Similar Items

- PDHS: Pattern-Based Deep Hate Speech Detection With Improved Tweet Representation
  by: P. Sharmila, et al.
  Published: (2022-01-01)
- Applying the language acquisition model to the solution of small language processing tasks
  by: Dz. I. Kachkou
  Published: (2022-03-01)
- Evaluating Deep Learning Techniques for Natural Language Inference
  by: Petros Eleftheriadis, et al.
  Published: (2023-02-01)
- Bidirectional encoders to state-of-the-art: a review of BERT and its transformative impact on natural language processing
  by: Rajesh Gupta
  Published: (2024-03-01)
- Language Representation Models: An Overview
  by: Thorben Schomacker, et al.
  Published: (2021-10-01)