A Multi-Granular Aggregation-Enhanced Knowledge Graph Representation for Recommendation

Bibliographic Details
Main Authors: Xi Liu, Rui Song, Yuhang Wang, Hao Xu
Format: Article
Language: English
Published: MDPI AG 2022-04-01
Series: Information
Online Access: https://www.mdpi.com/2078-2489/13/5/229
Description
Summary: A knowledge graph (KG) helps to improve the accuracy, diversity, and interpretability of recommender systems. KGs have been applied to recommendation by exploiting graph neural networks (GNNs), but most existing GNN-based recommendation models ignore the influence of node types and the loss of information during aggregation. In this paper, we propose a new model, named A Multi-Granular Aggregation-Enhanced Knowledge Graph Representation for Recommendation (MAKR), that relieves the sparsity of the network and overcomes the information loss of traditional GNN recommendation models. Specifically, we propose a new graph, named the Improved Collaborative Knowledge Graph (ICKG), that integrates user–item interactions and a knowledge graph into a single large heterogeneous network. It divides the nodes of this network into three categories (users, items, and entities) and connects edges according to the similarity between users and items, so as to enhance the high-order connectivity of the graph. In addition, we used attention mechanisms, the factorization machine (FM), and Transformer (Trm) algorithms to aggregate messages at multiple granularities and from different node types to improve the representation ability of the model. Empirical results on three public benchmarks showed that MAKR outperformed state-of-the-art methods such as Neural FM, RippleNet, and KGAT.
ISSN: 2078-2489
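
To make the graph-construction step in the summary concrete, the following is a minimal Python sketch of how an Improved Collaborative Knowledge Graph could be assembled: user-item interactions and KG triples are merged into one heterogeneous graph with three node types (user, item, entity), and extra similarity edges are added to densify it. This is not the authors' code; the function names (build_ickg, jaccard), the Jaccard-overlap similarity rule, and the sim_threshold parameter are illustrative assumptions, since the summary does not specify the exact similarity measure or which node pairs are connected.

from collections import defaultdict
from itertools import combinations


def jaccard(a, b):
    """Jaccard overlap of two sets (0.0 when both are empty)."""
    return len(a & b) / len(a | b) if (a or b) else 0.0


def build_ickg(interactions, kg_triples, sim_threshold=0.5):
    """Merge user-item interactions and KG triples into one typed graph.

    interactions : iterable of (user_id, item_id) pairs
    kg_triples   : iterable of (item_id, relation, entity_id) triples
    Returns (node_types, edges), where node_types maps a node key to
    "user", "item", or "entity" and edges is a list of (head, rel, tail).
    """
    node_types = {}
    edges = []
    user_items = defaultdict(set)   # user -> items the user interacted with
    item_users = defaultdict(set)   # item -> users who interacted with it

    # 1. User-item interaction edges (the collaborative part of the graph).
    for u, i in interactions:
        node_types[f"u{u}"], node_types[f"i{i}"] = "user", "item"
        user_items[u].add(i)
        item_users[i].add(u)
        edges.append((f"u{u}", "interact", f"i{i}"))

    # 2. Item-entity edges taken from the knowledge graph.
    for i, rel, e in kg_triples:
        node_types.setdefault(f"i{i}", "item")
        node_types[f"e{e}"] = "entity"
        edges.append((f"i{i}", rel, f"e{e}"))

    # 3. Similarity edges to densify the graph. The summary only says edges
    #    are connected "according to the similarity between users and items";
    #    co-interaction Jaccard overlap is used here as one plausible choice.
    for u1, u2 in combinations(user_items, 2):
        if jaccard(user_items[u1], user_items[u2]) >= sim_threshold:
            edges.append((f"u{u1}", "similar_user", f"u{u2}"))
    for i1, i2 in combinations(item_users, 2):
        if jaccard(item_users[i1], item_users[i2]) >= sim_threshold:
            edges.append((f"i{i1}", "similar_item", f"i{i2}"))

    return node_types, edges


if __name__ == "__main__":
    nodes, edges = build_ickg(
        interactions=[(1, 10), (1, 11), (2, 10), (2, 11)],
        kg_triples=[(10, "directed_by", 100), (11, "directed_by", 100)],
    )
    print(len(nodes), "typed nodes,", len(edges), "edges")

On the toy data in the __main__ block, the two users end up connected both through shared items and through an added similarity edge, which is the kind of strengthened high-order connectivity the ICKG is described as providing before the attention, FM, and Transformer aggregation steps are applied.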