Leveraging linguistic knowledge to enhance low-resource NLP applications
Natural Language Processing (NLP) empowers computers to process and analyze vast amounts of text data. The introduction of pre-trained language models (PLMs) has significantly advanced NLP by incorporating deep learning algorithms, thereby enhancing the handling of natural language understanding (NL...
Main Author: Zhu, Zixiao
Other Authors: Mao, Kezhi
Format: Thesis (Doctor of Philosophy)
Language: English
Published: Nanyang Technological University, 2025
Subjects:
Online Access: https://hdl.handle.net/10356/182513
Similar Items
- Llama2 self-improvement using memory-of-thought
  by: Dong, Yuxiu
  Published: (2024)
- Neural abstractive summarization: improvements at the sequence-level
  by: Ravaut, Mathieu
  Published: (2024)
- Leveraging large language models and BERT for log parsing and anomaly detection
  by: Zhou, Yihan, et al.
  Published: (2024)
- From ELIZA to ChatGPT: The Evolution of NLP and Financial Applications
  by: Lo, Andrew W, et al.
  Published: (2023)
- Towards explainable and semantically coherent claim extraction for an automated fact-checker
  by: Yoswara, Jocelyn Valencia
  Published: (2024)