Transformers acceleration on autoNLP document classification

Unsupervised pre-training has been widely used in the field of Natural Language Processing: a large network is trained on unsupervised prediction tasks, with the BERT model as a representative example. BERT has achieved great success in various NLP downstream tasks, reaching state-of-the-art result...

Bibliographic Details
Main Author: Cao, Hannan
Other Authors: Sinno Jialin Pan
Format: Final Year Project (FYP)
Language: English
Published: Nanyang Technological University, 2020
Subjects:
Online Access: https://hdl.handle.net/10356/138506