Sequence-to-sequence pretraining for a less-resourced Slovenian language
Introduction: Large pretrained language models have recently conquered the area of natural language processing. As an alternative to predominant masked language modeling introduced in BERT, the T5 model has introduced a more general training objective, namely sequence-to-sequence transformation, which...
| Main Authors: | Matej Ulčar, Marko Robnik-Šikonja |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2023-03-01 |
| Series: | Frontiers in Artificial Intelligence |
| Online Access: | https://www.frontiersin.org/articles/10.3389/frai.2023.932519/full |
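
The abstract contrasts T5's sequence-to-sequence training objective with BERT-style masked language modeling. The snippet below is a minimal illustrative sketch of that objective, not code from the paper: the public `t5-small` checkpoint and the Hugging Face `transformers` library are used only as stand-ins, while the article itself concerns pretraining such a model for Slovenian.

```python
# Minimal sketch of the T5-style span-corruption (sequence-to-sequence)
# objective. Assumption: uses the public English "t5-small" checkpoint as a
# stand-in; the paper's own Slovenian model and training data are not shown.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Corrupted input: masked spans are replaced by sentinel tokens.
inputs = tokenizer(
    "The quick brown <extra_id_0> jumps over the lazy <extra_id_1>.",
    return_tensors="pt",
)

# Target sequence: the sentinels followed by the spans they replaced,
# so the decoder learns to generate the missing text.
labels = tokenizer(
    "<extra_id_0> fox <extra_id_1> dog <extra_id_2>",
    return_tensors="pt",
).input_ids

# Unlike BERT's per-token masked-LM loss, the loss here is ordinary
# cross-entropy over the generated output sequence.
outputs = model(**inputs, labels=labels)
print(outputs.loss)
```
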
Similar Items
- Research Progress on Vision–Language Multimodal Pretraining Model Technology
  by: Huansha Wang, et al.
  Published: (2022-10-01)
- Automatic Taxonomy Classification by Pretrained Language Model
  by: Ayato Kuwana, et al.
  Published: (2021-10-01)
- University Student Dropout Prediction Using Pretrained Language Models
  by: Hyun-Sik Won, et al.
  Published: (2023-06-01)
- Generative pretrained transformer 4: an innovative approach to facilitate value-based healthcare
  by: Han Lyu, et al.
  Published: (2024-02-01)
- Automatic Component Prediction for Issue Reports Using Fine-Tuned Pretrained Language Models
  by: Dae-Sung Wang, et al.
  Published: (2022-01-01)