End-to-End Transformer-Based Models in Textual-Based NLP

Transformer architectures are highly expressive because they use self-attention mechanisms to encode long-range dependencies in the input sequences. In this paper, we present a literature review on Transformer-based (TB) models, providing a detailed overview of each model in comparison to the Transf...
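The self-attention mechanism mentioned in the abstract can be illustrated with a minimal sketch of scaled dot-product attention (the core operation of Transformer models); the function name and toy dimensions below are illustrative, not from the article itself:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product self-attention sketch."""
    d_k = Q.shape[-1]
    # Pairwise similarity scores: every position attends to every other,
    # which is how long-range dependencies are encoded in one step.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key dimension turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output is a weighted mix of all value vectors.
    return weights @ V

# Toy self-attention: 4 tokens with embedding dim 8, using X as Q, K, and V.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (4, 8)
```

Because the score matrix relates all token pairs directly, no recurrence is needed to connect distant positions, which is the expressiveness the abstract refers to.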

Bibliographic Details
Main Authors: Abir Rahali, Moulay A. Akhloufi
Format: Article
Language: English
Published: MDPI AG, 2023-01-01
Series: AI
Online Access: https://www.mdpi.com/2673-2688/4/1/4