Sequence-to-sequence modeling for graph representation learning

Abstract: We propose sequence-to-sequence architectures for graph representation learning in both supervised and unsupervised regimes. Our methods use recurrent neural networks to encode and decode information from graph-structured data. Recurrent neural networks require sequences, so we choose sever...
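The abstract notes that recurrent networks require sequences, so the graph must first be linearized. The truncated text does not say which sequence-generation methods the authors chose; as one plausible illustration only, the sketch below generates fixed-length random-walk node sequences from a toy adjacency list (the function name and parameters are hypothetical, not from the paper):

```python
import random

def random_walks(adj, walk_length=5, walks_per_node=2, seed=0):
    """Generate fixed-length random-walk node sequences from an
    adjacency-list graph. A hypothetical sketch of one common way
    to turn a graph into RNN-ready sequences; not the paper's method."""
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_node):
            walk = [start]
            while len(walk) < walk_length:
                neighbors = adj[walk[-1]]
                if not neighbors:  # dead end: stop the walk early
                    break
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks

# Toy graph: a 4-node cycle
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
walks = random_walks(adj)
print(len(walks))  # 2 walks per node over 4 nodes -> 8 sequences
```

Sequences like these could then be fed to a seq2seq encoder-decoder, with the decoder reconstructing the walk in the unsupervised regime.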


Bibliographic Details
Main Authors: Aynaz Taheri, Kevin Gimpel, Tanya Berger-Wolf
Format: Article
Language: English
Published: SpringerOpen 2019-08-01
Series: Applied Network Science
Online Access: http://link.springer.com/article/10.1007/s41109-019-0174-8