Enhance QANet by BERT for machine reading comprehension

This dissertation studies the implementation of QANet on SQuAD for the machine reading comprehension task and enhances QANet by introducing the contextual representation architecture BERT. First, several techniques used to capture dependencies among words in question answering are reviewed, including recurrent neural networks and the attention mechanism. The popular NLP downstream task of question answering and its subclasses are then surveyed, with emphasis on the central machine reading comprehension task, followed by a review of two categories of reading comprehension models. Word representation methods are introduced in detail, along with an account of how they have evolved over time. The architecture of the reading comprehension model QANet is reviewed layer by layer. The core of the dissertation is understanding the implementation of QANet and finding the best performance of the original model through hyperparameter tuning, followed by applying the state-of-the-art contextual word representation method BERT to enhance QANet. Experimental results show that BERT improves the F1 score on the validation set by 7.2 percent. An ablation study verifies how QANet's components, such as the attention mechanism used to capture global interactions, contribute to overall performance. The QANet-with-BERT model is also compared with a BiDAF-with-BERT model to show that QANet with BERT may outperform an RNN-based model with BERT on some datasets.
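The enhancement described above amounts to replacing QANet's static word-embedding layer with contextual token representations from a pretrained BERT encoder. The sketch below shows one plausible wiring using PyTorch and the Hugging Face transformers library; the class name, the 128-dimensional QANet model size, and the choice to freeze BERT are illustrative assumptions, not the dissertation's actual code.

    # Sketch: feed BERT contextual embeddings into a QANet-style encoder.
    # Names and dimensions are illustrative assumptions, not the thesis code.
    import torch
    import torch.nn as nn
    from transformers import BertModel, BertTokenizer

    class BertContextEmbedding(nn.Module):
        """Contextual embeddings shaped to feed QANet's encoder blocks."""

        def __init__(self, qanet_dim: int = 128,
                     bert_name: str = "bert-base-uncased"):
            super().__init__()
            self.bert = BertModel.from_pretrained(bert_name)
            # BERT-base emits 768-dim vectors; the original QANet uses a
            # 128-dim model size, so a linear projection bridges the two.
            self.project = nn.Linear(self.bert.config.hidden_size, qanet_dim)

        def forward(self, input_ids, attention_mask):
            # BERT is kept frozen here; fine-tuning it end to end is the
            # obvious variant.
            with torch.no_grad():
                out = self.bert(input_ids=input_ids,
                                attention_mask=attention_mask)
            # (batch, seq_len, qanet_dim), ready for QANet's encoder stack
            return self.project(out.last_hidden_state)

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    # Encode a question/context pair the way BERT expects (segment A/B).
    batch = tokenizer(["Who proposed QANet?"],
                      ["QANet was proposed by Yu et al."],
                      return_tensors="pt", padding=True)
    embedder = BertContextEmbedding()
    contextual = embedder(batch["input_ids"], batch["attention_mask"])
    print(contextual.shape)  # torch.Size([1, seq_len, 128])

Projecting BERT's 768-dimensional outputs down to QANet's model dimension is one common way to bridge the two architectures; the same wrapper could front a BiDAF-style model for the comparison mentioned in the abstract.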

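For reference, the F1 score cited in the abstract is the standard SQuAD metric: bag-of-tokens overlap between the predicted and gold answer spans. A simplified version is sketched below; the official SQuAD evaluation script additionally strips punctuation and articles and takes the maximum over multiple gold answers.

    # Sketch of the SQuAD-style F1 metric behind the reported 7.2 percent
    # validation improvement: token overlap between predicted and gold spans.
    from collections import Counter

    def f1_score(prediction: str, ground_truth: str) -> float:
        pred_tokens = prediction.lower().split()
        gold_tokens = ground_truth.lower().split()
        common = Counter(pred_tokens) & Counter(gold_tokens)
        num_same = sum(common.values())
        if num_same == 0:
            return 0.0
        precision = num_same / len(pred_tokens)
        recall = num_same / len(gold_tokens)
        return 2 * precision * recall / (precision + recall)

    print(f1_score("Yu et al proposed QANet",
                   "QANet was proposed by Yu et al"))  # ~0.83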

Bibliographic Details
Main Author: Yin, Bo
Other Authors: Chen Lihui
Institution: Nanyang Technological University, School of Electrical and Electronic Engineering
Format: Thesis
Degree: Master of Science (Signal Processing)
Language: English
Published: 2019
Subjects: Engineering::Electrical and electronic engineering
Physical Description: 73 p.
Online Access: http://hdl.handle.net/10356/78809