Synthetic word embedding generation for downstream NLP tasks
Distributional word representations such as GloVe and BERT have garnered immense popularity and research interest in recent years due to their success in many downstream NLP applications. However, a major limitation of word embeddings is their inability to handle unknown words. To make sense...
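The limitation mentioned in the abstract can be illustrated with a minimal sketch (toy vocabulary and vectors are hypothetical, not from the thesis): a fixed-vocabulary embedding table simply has no vector for a word unseen at training time, which is the out-of-vocabulary (OOV) problem the work addresses.

```python
# Toy embedding table with a fixed, closed vocabulary.
# Real pretrained embeddings (e.g. GloVe) behave the same way:
# lookup succeeds only for words present at training time.
embeddings = {
    "cat": [0.2, -0.1, 0.7],
    "dog": [0.3, 0.0, 0.5],
}

def lookup(word):
    # OOV words return None here; practical systems instead fall back
    # to an <unk> vector, subword pieces, or a synthesized embedding.
    return embeddings.get(word)

print(lookup("cat"))       # known word: its vector is returned
print(lookup("blorptle"))  # unknown word: no representation exists
```

Synthetic embedding generation aims to fill exactly this gap by producing a plausible vector for the unseen word rather than discarding it.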
Main Author: Hoang, Viet
Other Authors: Chng Eng Siong
Format: Final Year Project (FYP)
Language: English
Published: Nanyang Technological University, 2021
Online Access: https://hdl.handle.net/10356/153201
Similar Items
- Design and development of a NLP demo system
  by: Lynn Htet Aung
  Published: (2022)
- Data-driven and NLP for long document learning representation
  by: Ko, Seoyoon
  Published: (2021)
- Evolving type-2 neural fuzzy inference system with embedded deep learning in dynamic portfolio rebalancing
  by: Dinh Khoat Hoang Anh
  Published: (2021)
- An embedded neuro-fuzzy architecture for explainable time series analysis
  by: Xie, Chen
  Published: (2022)
- Deep neural network compression for pixel-level vision tasks
  by: He, Wei
  Published: (2021)