Multi-hop question generation with knowledge graph-enhanced language model


Bibliographic Details

Main Authors: Li, Zhenping; Cao, Zhen; Li, Pengfei; Zhong, Yong; Li, Shaobo
Other Authors: School of Electrical and Electronic Engineering
Format: Journal Article
Language: English
Published: 2023
Subjects: Engineering::Electrical and electronic engineering; Multi-Hop Question Generation; Graph Neural Network
Online Access: https://hdl.handle.net/10356/169522
Description: The task of multi-hop question generation (QG) is to generate questions that require a complex reasoning process spanning multiple sentences and answers. Beyond the conventional challenges of what to ask and how to ask, multi-hop QG necessitates sophisticated reasoning over evidence dispersed across multiple sentences. To address these challenges, a knowledge graph-enhanced language model (KGEL) has been developed to imitate human reasoning for multi-hop questions. The first step in KGEL encodes the input sentence with a pre-trained GPT-2 language model to obtain a comprehensive semantic context representation. Next, a knowledge graph is constructed from the entities identified within the context. The information in the graph that is relevant to the answer is then used to update the context representations through an answer-aware graph attention network (GAT). Finally, a multi-head attention generation module (MHAG) is applied over the updated latent representations of the context to generate coherent questions. Human evaluations demonstrate that KGEL generates more logical and fluent multi-hop questions than GPT-2. Furthermore, KGEL outperforms five prominent baselines in automatic evaluations, with a BLEU-4 score that is 27% higher than that of GPT-2.
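The answer-aware attention step described above can be sketched as a single attention head that scores each entity node in the graph against the answer representation and aggregates the nodes accordingly. This is a minimal illustrative sketch only: the function names, toy vectors, and plain dot-product scoring are assumptions for exposition, not the authors' implementation, which uses graph structure and learned projections inside a full GAT.

```python
# Hypothetical sketch of answer-aware attention over entity-node embeddings.
# Scores each node against the answer vector, normalizes with softmax, and
# returns the attention-weighted sum used to update the context representation.
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def answer_aware_attention(node_vecs, answer_vec):
    """One attention head: dot-product score of each entity node against the
    answer vector, softmax normalization, then a weighted sum of the nodes."""
    scores = [sum(n_i * a_i for n_i, a_i in zip(n, answer_vec)) for n in node_vecs]
    weights = softmax(scores)
    dim = len(answer_vec)
    return [sum(w * n[d] for w, n in zip(weights, node_vecs)) for d in range(dim)]

# Toy example: two entity nodes with 2-d embeddings; the answer vector is
# closer to node 0, so node 0 dominates the updated representation.
nodes = [[1.0, 0.0], [0.0, 1.0]]
answer = [2.0, 0.0]
updated = answer_aware_attention(nodes, answer)
```

With these toy vectors the node aligned with the answer receives most of the attention mass, which mirrors the paper's intent of letting answer-relevant graph information dominate the context update.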
Institution: Nanyang Technological University
Citation: Li, Z., Cao, Z., Li, P., Zhong, Y. & Li, S. (2023). Multi-hop question generation with knowledge graph-enhanced language model. Applied Sciences, 13(9), 5765. https://dx.doi.org/10.3390/app13095765
ISSN: 2076-3417
DOI: 10.3390/app13095765
Subjects: Engineering::Electrical and electronic engineering; Multi-Hop Question Generation; Graph Neural Network
Version: Published version
Funding: This work was supported by the AI industrial technology innovation platform of Sichuan Province, grant number 2020ZHCG0002.
Rights: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).