A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation

Story generation, namely, generating a reasonable story from a leading context, is an important but challenging task. In spite of the success in modeling fluency and local coherence, existing neural language generation models (e.g., GPT-2) still suffer from repetition, logic conflicts, and lack of l...

Bibliographic Details
Main Authors: Guan, Jian, Huang, Fei, Zhao, Zhihao, Zhu, Xiaoyan, Huang, Minlie
Format: Article
Language: English
Published: The MIT Press 2020-07-01
Series: Transactions of the Association for Computational Linguistics
Online Access: https://www.mitpressjournals.org/doi/abs/10.1162/tacl_a_00302