Learned Text Representation for Amharic Information Retrieval and Natural Language Processing

Over the past few years, word embeddings and Bidirectional Encoder Representations from Transformers (BERT) models have brought better solutions to learning text representations for natural language processing (NLP) and other tasks. Many NLP applications rely on pre-trained text representations, lea...
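As a rough illustration of the kind of learned text representation the abstract refers to (not taken from the article itself), the sketch below encodes an Amharic sentence with a pre-trained multilingual BERT checkpoint via the Hugging Face transformers library; the model name, the example sentence, and the mean-pooling step are illustrative assumptions, not the representation studied in the article.

    # A minimal sketch, assuming the transformers and torch packages are installed;
    # the multilingual checkpoint and mean pooling are illustrative choices only.
    from transformers import AutoModel, AutoTokenizer
    import torch

    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = AutoModel.from_pretrained("bert-base-multilingual-cased")

    sentence = "አዲስ አበባ የኢትዮጵያ ዋና ከተማ ናት።"  # an Amharic example sentence
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Mean-pool the contextual token vectors into one fixed-size sentence embedding.
    sentence_embedding = outputs.last_hidden_state.mean(dim=1)
    print(sentence_embedding.shape)  # torch.Size([1, 768])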


Bibliographic Details
Main Authors: Tilahun Yeshambel, Josiane Mothe, Yaregal Assabie
Format: Article
Language: English
Published: MDPI AG 2023-03-01
Series: Information
Online Access: https://www.mdpi.com/2078-2489/14/3/195