Improving text mining in plant health domain with GAN and/or pre-trained language model

The Bidirectional Encoder Representations from Transformers (BERT) architecture offers a cutting-edge approach to Natural Language Processing. It involves two steps: 1) pre-training a language model to extract contextualized features and 2) fine-tuning for specific downstream tasks. Although pre-tra...


Bibliographic Details
Main Authors: Shufan Jiang, Stéphane Cormier, Rafael Angarita, Francis Rousseaux
Format: Article
Language: English
Published: Frontiers Media S.A. 2023-02-01
Series: Frontiers in Artificial Intelligence
Online Access: https://www.frontiersin.org/articles/10.3389/frai.2023.1072329/full
