Towards Knowledge Enhanced Language Model for Machine Reading Comprehension
Machine reading comprehension is a crucial and challenging task in natural language processing (NLP). Recently, knowledge graph (KG) embedding has gained massive attention as it can effectively provide side information for downstream tasks. However, most previous knowledge-based models do not take i...
| Main Authors: | Peizhu Gong, Jin Liu, Yihe Yang, Huihua He |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2020-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/9292918/ |
Similar Items
- Multigranularity Syntax Guidance with Graph Structure for Machine Reading Comprehension
  by: Chuanyun Xu, et al.
  Published: (2022-09-01)
- K-LM: Knowledge Augmenting in Language Models Within the Scholarly Domain
  by: Vivek Kumar, et al.
  Published: (2022-01-01)
- Machine Reading Comprehension Model Based on Fusion of Mixed Attention
  by: Yanfeng Wang, et al.
  Published: (2024-09-01)
- Legal element extraction method based on BERT reading comprehension framework
  by: Hui HUANG, et al.
  Published: (2021-11-01)
- An Idiom Reading Comprehension Model Based on Multi-Granularity Reasoning and Paraphrase Expansion
  by: Yu Dai, et al.
  Published: (2023-05-01)