Language modeling and bidirectional coders representations: an overview of key technologies

The article is an essay on the development of the natural language processing technologies that form the basis of BERT (Bidirectional Encoder Representations from Transformers), a language model from Google that shows strong results on a whole class of problems associated with natural language understanding. Two key ideas implemented in BERT are knowledge transfer and the attention mechanism. The model is pre-trained on two tasks over a large unlabeled data set and can reuse the language patterns it has learned for efficient training on a specific text processing problem. The Transformer architecture is based on the attention mechanism, i.e. it evaluates the relationships between input tokens. In addition, the article notes the strengths and weaknesses of BERT and directions for further improvement of the model.
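The attention mechanism mentioned in the abstract can be illustrated with a minimal sketch of scaled dot-product attention, the core operation of the Transformer architecture. The code below is not taken from the article; the function name, array shapes, and toy data are illustrative assumptions.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: arrays of shape (seq_len, d_k), (seq_len, d_k), (seq_len, d_v).
    d_k = Q.shape[-1]
    # Pairwise relevance scores between tokens, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all value vectors.
    return weights @ V, weights

# Toy self-attention over 3 tokens with 4-dimensional representations.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
output, weights = scaled_dot_product_attention(x, x, x)
print(weights)  # each row shows how strongly one token attends to the others

Each row of the printed weight matrix sums to 1 and shows how strongly one token attends to every other token, which is the evaluation of relationships between input tokens that the abstract describes.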

Bibliographic Details
Main Author: D. I. Kachkou (Belarusian State University)
Format: Article
Language: Russian
Published: The United Institute of Informatics Problems of the National Academy of Sciences of Belarus, 2021-01-01
Series: Informatika, vol. 17, no. 4, pp. 61–72
ISSN: 1816-0301
DOI: 10.37661/1816-0301-2020-17-4-61-72
Subjects: informatics; information technology; language models; natural language processing; attention mechanism; transformer architecture; BERT model
Online Access: https://inf.grid.by/jour/article/view/1080