Learned Text Representation for Amharic Information Retrieval and Natural Language Processing
Over the past few years, word embeddings and bidirectional encoder representations from transformers (BERT) models have brought better solutions to learning text representations for natural language processing (NLP) and other tasks. Many NLP applications rely on pre-trained text representations, lea...
| Main Authors: | Tilahun Yeshambel, Josiane Mothe, Yaregal Assabie |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2023-03-01 |
| Series: | Information |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2078-2489/14/3/195 |
Similar Items
- Amharic <i>Adhoc</i> Information Retrieval System Based on Morphological Features
  by: Tilahun Yeshambel, et al.
  Published: (2022-01-01)
- Semantic Role Labeling for Amharic Text Using Multiple Embeddings and Deep Neural Network
  by: Bemnet Meresa Hailu, et al.
  Published: (2023-01-01)
- A Text Abstraction Summary Model Based on BERT Word Embedding and Reinforcement Learning
  by: Qicai Wang, et al.
  Published: (2019-11-01)
- MaterialBERT for natural language processing of materials science texts
  by: Michiko Yoshitake, et al.
  Published: (2022-12-01)
- Bag of Words and Embedding Text Representation Methods for Medical Article Classification
  by: Cichosz Paweł
  Published: (2023-12-01)