Learn a prior question-aware feature for machine reading comprehension
Machine reading comprehension aims to train machines to comprehend a given context and then answer a series of questions according to their understanding of the context. It is the cornerstone of conversational reading comprehension and question answering tasks.
Main Authors: | Yu Zhang, Bo Shen, Xing Cao |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2022-12-01 |
Series: | Frontiers in Physics |
Subjects: | question and answer pairs; prior attention mechanism; transfer learning; machine reading comprehension (MRC); natural language processing (NLP) |
Online Access: | https://www.frontiersin.org/articles/10.3389/fphy.2022.1085102/full |
_version_ | 1797979413272330240 |
---|---|
author | Yu Zhang Bo Shen Xing Cao |
author_facet | Yu Zhang Bo Shen Xing Cao |
author_sort | Yu Zhang |
collection | DOAJ |
description | Machine reading comprehension aims to train machines to comprehend a given context and then answer a series of questions according to their understanding of that context. It is the cornerstone of conversational reading comprehension and question answering tasks. Recently, research on machine reading comprehension has developed considerably, with more and more semantic features being incorporated into end-to-end neural network models, such as pre-trained word embedding features, syntactic features, and context-question interaction features. However, these methods neglect the understanding of the question itself and of the information the question seeks. In this paper, we design an auxiliary question-answer matching task to learn the features of different types of questions and then integrate these learned features into a classical machine reading comprehension model architecture to improve its ability to comprehend questions. The auxiliary task relies on a simple question-answer pairs dataset that we generate ourselves, and we incorporate the learned question-type information into the machine reading comprehension model through a prior attention mechanism. We name the proposed model PrA-MRC (Prior Attention on Machine Reading Comprehension). Empirical results show that our approach is effective and interpretable: the question-answer pairs model achieves an accuracy of 84%, and PrA-MRC outperforms the baseline model by +0.7 EM and +1.1 F1 on the SQuAD dataset. |
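The description names two components: question-type features learned on an auxiliary question-answer matching task, and a prior attention mechanism that injects those features into the MRC model. The paper's actual architecture is behind the linked full text; below is only a minimal, hypothetical PyTorch sketch of what such a prior attention layer could look like. Every name here (`PriorAttention`, `num_question_types`, the choice of seven question types) is an illustrative assumption, not the authors' code.

```python
# Hypothetical sketch (NOT the authors' implementation) of a prior
# attention layer: a question-type embedding, assumed to be pre-trained
# on the auxiliary question-answer matching task, reweights the
# contextual token states before span prediction.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PriorAttention(nn.Module):
    def __init__(self, hidden_dim: int, num_question_types: int):
        super().__init__()
        # Assumed transferred from the auxiliary QA-pair classifier.
        self.type_embed = nn.Embedding(num_question_types, hidden_dim)

    def forward(self, context_h: torch.Tensor,
                question_type_id: torch.Tensor) -> torch.Tensor:
        # context_h: (batch, ctx_len, hidden) contextual token states
        # question_type_id: (batch,) predicted type of each question
        prior = self.type_embed(question_type_id)              # (B, H)
        # Similarity of each context token to the type prior.
        logits = torch.einsum("blh,bh->bl", context_h, prior)  # (B, L)
        weights = F.softmax(logits, dim=-1)
        # Bias the context representation toward type-relevant tokens.
        return context_h * weights.unsqueeze(-1)               # (B, L, H)

# Usage with illustrative sizes (seven wh-style question types assumed):
layer = PriorAttention(hidden_dim=128, num_question_types=7)
ctx = torch.randn(2, 50, 128)
qtype = torch.tensor([3, 5])
out = layer(ctx, qtype)  # (2, 50, 128)
```

The design choice this sketch illustrates is that the prior acts before the usual context-question interaction layers, so type information ("who" questions favor person entities, "when" questions favor dates, and so on) can shape attention even when the question wording gives weak lexical cues.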
first_indexed | 2024-04-11T05:38:36Z |
format | Article |
id | doaj.art-ea10137435454bda87f5c4c25bfd0083 |
institution | Directory Open Access Journal |
issn | 2296-424X |
language | English |
last_indexed | 2024-04-11T05:38:36Z |
publishDate | 2022-12-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Physics |
spelling | doaj.art-ea10137435454bda87f5c4c25bfd0083 2022-12-22T10:04:06Z eng Frontiers Media S.A. Frontiers in Physics 2296-424X 2022-12-01 Vol. 10 10.3389/fphy.2022.1085102 Article 1085102. Learn a prior question-aware feature for machine reading comprehension. Yu Zhang (School of Electronic and Information, Beijing Jiaotong University, Beijing, China); Bo Shen (Key Laboratory of Communication and Information Systems, Beijing Municipal Commission of Education, Beijing Jiaotong University, Beijing, China); Xing Cao (School of Electronic and Information, Beijing Jiaotong University, Beijing, China). [Abstract as in the description field above.] https://www.frontiersin.org/articles/10.3389/fphy.2022.1085102/full Keywords: question and answer pairs; prior attention mechanism; transfer learning; machine reading comprehension (MRC); natural language processing (NLP) |
spellingShingle | Yu Zhang; Bo Shen; Xing Cao; Learn a prior question-aware feature for machine reading comprehension; Frontiers in Physics; question and answer pairs; prior attention mechanism; transfer learning; machine reading comprehension (MRC); natural language processing (NLP) |
title | Learn a prior question-aware feature for machine reading comprehension |
title_full | Learn a prior question-aware feature for machine reading comprehension |
title_fullStr | Learn a prior question-aware feature for machine reading comprehension |
title_full_unstemmed | Learn a prior question-aware feature for machine reading comprehension |
title_short | Learn a prior question-aware feature for machine reading comprehension |
title_sort | learn a prior question aware feature for machine reading comprehension |
topic | question and answer pairs; prior attention mechanism; transfer learning; machine reading comprehension (MRC); natural language processing (NLP) |
url | https://www.frontiersin.org/articles/10.3389/fphy.2022.1085102/full |
work_keys_str_mv | AT yuzhang learnapriorquestionawarefeatureformachinereadingcomprehension AT boshen learnapriorquestionawarefeatureformachinereadingcomprehension AT xingcao learnapriorquestionawarefeatureformachinereadingcomprehension |