Improving text mining in plant health domain with GAN and/or pre-trained language model
The Bidirectional Encoder Representations from Transformers (BERT) architecture offers a cutting-edge approach to Natural Language Processing. It involves two steps: 1) pre-training a language model to extract contextualized features and 2) fine-tuning for specific downstream tasks. Although pre-tra...
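The two-step paradigm described in the abstract can be illustrated with a minimal sketch, assuming the Hugging Face `transformers` library and PyTorch; this is not the authors' implementation, and the binary plant-health classification task, example texts, and labels are hypothetical:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Step 1: reuse contextualized features learned during generic BERT
# pre-training by loading published pre-trained weights.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # hypothetical: pest observation vs. other
)

# Hypothetical labeled examples for the downstream task.
texts = ["Aphids spotted on the wheat leaves this morning.",
         "Beautiful sunset over the farm today."]
labels = torch.tensor([1, 0])

# Step 2: fine-tune the weights on the task-specific objective
# (one gradient step shown; a real run iterates over a dataset).
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
```

The same pre-trained checkpoint can be fine-tuned for different downstream tasks by swapping the task head, which is what makes the pre-train/fine-tune split useful for specialized domains such as plant health.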
Main Authors: | Shufan Jiang, Stéphane Cormier, Rafael Angarita, Francis Rousseaux
---|---
Format: | Article
Language: | English
Published: | Frontiers Media S.A., 2023-02-01
Series: | Frontiers in Artificial Intelligence
Online Access: | https://www.frontiersin.org/articles/10.3389/frai.2023.1072329/full
Similar Items
- Learned Text Representation for Amharic Information Retrieval and Natural Language Processing, by: Tilahun Yeshambel, et al. Published: (2023-03-01)
- Domain-Specific Language Model Pre-Training for Korean Tax Law Classification, by: Yeong Hyeon Gu, et al. Published: (2022-01-01)
- A Study on Generating Webtoons Using Multilingual Text-to-Image Models, by: Kyungho Yu, et al. Published: (2023-06-01)
- Realistic Image Generation from Text by Using BERT-Based Embedding, by: Sanghyuck Na, et al. Published: (2022-03-01)
- Natural Language Processing in Diagnostic Texts from Nephropathology, by: Maximilian Legnar, et al. Published: (2022-07-01)