Capturing Greater Context for Question Generation
Automatic question generation can benefit many applications ranging from dialogue systems to reading comprehension. While questions are often asked with respect to long documents, there are many challenges with modeling such long documents. Many existing techniques generate questions by effectively looking at one sentence at a time, leading to questions that are easy and not reflective of the human process of question generation. Our goal is to incorporate interactions across multiple sentences to generate realistic questions for long documents. In order to link a broad document context to the target answer, we represent the relevant context via a multi-stage attention mechanism, which forms the foundation of a sequence to sequence model. We outperform state-of-the-art methods on question generation on three question-answering datasets - SQuAD, MS MARCO and NewsQA.
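The abstract describes a multi-stage attention mechanism that links broad document context to the target answer before feeding a sequence-to-sequence decoder. The following is a minimal illustrative sketch of that idea, not the paper's actual architecture: it assumes simple dot-product attention, a first stage that scores document words against the answer representation, and a second stage that re-attends using the first stage's summary.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_stage_attention(word_states, answer_state):
    """Two-stage dot-product attention sketch (hypothetical, for
    illustration): stage 1 scores each document word against the
    answer; stage 2 re-weights the words using the stage-1 summary,
    producing one context vector for the decoder."""
    # Stage 1: answer-aware attention over all document word states.
    scores1 = softmax(word_states @ answer_state)   # (num_words,)
    summary = scores1 @ word_states                 # (dim,)
    # Stage 2: refine attention using the stage-1 summary.
    scores2 = softmax(word_states @ summary)        # (num_words,)
    return scores2 @ word_states                    # (dim,) context vector

rng = np.random.default_rng(0)
doc = rng.normal(size=(12, 8))  # 12 document word states, dim 8
ans = rng.normal(size=8)        # answer representation, dim 8
ctx = multi_stage_attention(doc, ans)
print(ctx.shape)  # (8,)
```

In the paper's full model this context vector would condition the decoder at generation time; here it simply demonstrates how stacked attention stages can pull answer-relevant information from across a long document rather than a single sentence.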
Main Authors: | Tuan, Luu Anh; Shah, Darsh J. (Darsh Jaidip); Barzilay, Regina |
---|---|
Other Authors: | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
Format: | Article |
Language: | English |
Published: | Association for the Advancement of Artificial Intelligence (AAAI), 2020 |
Online Access: | https://hdl.handle.net/1721.1/128714 |
Citation: | Tuan, Luu Anh et al. "Capturing Greater Context for Question Generation." Proceedings of the AAAI Conference on Artificial Intelligence 34, 5 (April 2020): 9065-9072. © 2020 Association for the Advancement of Artificial Intelligence |
---|---|
DOI: | http://dx.doi.org/10.1609/aaai.v34i05.6440 |
ISSN: | 2374-3468; 2159-5399 |
Departments: | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
Funding: | DSO (Grant DSOCL18002) |
License: | Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/) |