Summary: | Knowledge graphs (KGs) play an important role in many artificial intelligence applications. Representation learning of KGs aims to project both entities and relations into a continuous low-dimensional space. Embedding-based representation learning has been used to perform KG completion, which aims to predict missing triples (head, relation, tail) in a KG. Most current methods learn representations from triple information alone, ignoring the textual knowledge and network topology of the KG, which leads to ambiguous completions. To address this problem and achieve more accurate KG completion, we propose a new representation learning model, the TDN model, which jointly embeds the information of triples, text descriptions, and the network structure of the KG in a low-dimensional vector space. We define the TDN framework and explore the methodology for implementing TDN embedding. To verify the effectiveness of the proposed model, we evaluate TDN via link prediction experiments on real-world datasets. The experimental results confirm the above claims and show that TDN-based embedding significantly outperforms the baselines.
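The summary does not specify TDN's internal architecture, so the following is only a minimal sketch of the general idea it describes: fusing triple-based, text-description, and network-structure signals into a single entity representation and scoring candidate triples for link prediction with a TransE-style distance. All class, parameter, and feature names here are hypothetical illustrations, not the authors' implementation.

```python
# Illustrative sketch only: a generic fusion of triple, text, and structure
# signals for link-prediction scoring. Hyperparameters and names are assumptions.
import torch
import torch.nn as nn


class FusedKGEmbedding(nn.Module):
    def __init__(self, num_entities, num_relations, dim, text_dim, struct_dim):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)      # triple-based entity vectors
        self.rel = nn.Embedding(num_relations, dim)     # relation vectors
        self.text_proj = nn.Linear(text_dim, dim)       # project text-description features
        self.struct_proj = nn.Linear(struct_dim, dim)   # project network-structure features
        self.gate = nn.Parameter(torch.tensor([0.6, 0.2, 0.2]))  # learnable fusion weights

    def entity_vec(self, idx, text_feat, struct_feat):
        # Weighted fusion of the three information sources into one entity vector.
        w = torch.softmax(self.gate, dim=0)
        return (w[0] * self.ent(idx)
                + w[1] * self.text_proj(text_feat)
                + w[2] * self.struct_proj(struct_feat))

    def score(self, h_idx, r_idx, t_idx, h_text, h_struct, t_text, t_struct):
        # TransE-style score: higher (less negative) means a more plausible triple.
        h = self.entity_vec(h_idx, h_text, h_struct)
        t = self.entity_vec(t_idx, t_text, t_struct)
        r = self.rel(r_idx)
        return -torch.norm(h + r - t, p=1, dim=-1)
```

In a link-prediction evaluation of the kind mentioned above, such a scorer would rank all candidate tail (or head) entities for a given (head, relation) pair and report ranking metrics such as mean rank or Hits@k.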