A quantum‐like approach for text generation from knowledge graphs


Bibliographic Details
Main Authors: Jia Zhu, Xiaodong Ma, Zhihao Lin, Pasquale De Meo
Format: Article
Language: English
Published: Wiley 2023-12-01
Series: CAAI Transactions on Intelligence Technology
Subjects:
Online Access: https://doi.org/10.1049/cit2.12178
Description
Summary: Recent text generation methods frequently learn node representations from graph‐based data, such as knowledge graphs, via global or local aggregation. Global encoding treats all nodes as directly connected, enabling direct communication between two distant nodes but disregarding graph topology. Local encoding captures the graph structure by considering connections between nearby nodes, but misses long‐range relations. A quantum‐like approach to learning better‐contextualised node embeddings is proposed, using a fusion model that combines both encoding strategies. In various experiments, the proposed method significantly improves over state‐of‐the‐art models on two graph‐to‐text datasets.
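The contrast the abstract draws between the two encoding strategies can be illustrated with a minimal sketch. This is not the authors' implementation: the attention, neighbour averaging, and the fixed mixing weight `alpha` are illustrative assumptions (the paper's fusion model is learned, and its quantum‐like component is not reproduced here).

```python
# Minimal sketch of fusing a "global" node encoding (every node
# attends to every other node, ignoring topology) with a "local"
# encoding (aggregation over direct neighbours only).
import numpy as np

def global_encode(X):
    # Dense self-attention over all node pairs: distant nodes
    # communicate directly, but edge structure is ignored.
    scores = X @ X.T / np.sqrt(X.shape[1])
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X

def local_encode(X, A):
    # Mean aggregation over direct neighbours (with self-loops):
    # captures graph structure but misses long-range relations.
    A_hat = A + np.eye(A.shape[0])
    return (A_hat @ X) / A_hat.sum(axis=1, keepdims=True)

def fuse(X, A, alpha=0.5):
    # Hypothetical fusion: a fixed convex combination of the two
    # encodings, standing in for the paper's learned fusion model.
    return alpha * global_encode(X) + (1 - alpha) * local_encode(X, A)

# Toy graph: 4 nodes on a path 0-1-2-3, 8-dimensional features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 8))
H = fuse(X, A)
print(H.shape)  # (4, 8)
```

Note the trade-off the sketch makes visible: `global_encode` lets nodes 0 and 3 exchange information in one step even though no edge connects them, while `local_encode` restricts each node to its immediate neighbours.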
ISSN:2468-2322