CEG: A joint model for causal commonsense events enhanced story ending generation
With the success of pre-trained language models, the performance of story ending generation has improved dramatically, yet the task remains challenging due to a lack of commonsense reasoning ability. Most previous works mainly focus on using commonsense knowledge to enhance the implicit correlation...
Main Authors: Yushi Zhang, Yan Yang, Ming Gu, Feng Gao, Chengcai Chen, Liang He
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2023-01-01
Series: PLoS ONE
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10204949/?tool=EBI
Similar Items
- CEG: A joint model for causal commonsense events enhanced story ending generation.
  by: Yushi Zhang, et al.
  Published: (2023-01-01)
- CLICK: Integrating Causal Inference and Commonsense Knowledge Incorporation for Counterfactual Story Generation
  by: Dandan Li, et al.
  Published: (2023-10-01)
- A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
  by: Guan, Jian, et al.
  Published: (2020-07-01)
- A commonsense approach to story understanding
  by: Williams, Bryan Michael
  Published: (2018)
- Strategic impact of macro-environmental factors on the performance of public utility concessions: CEG and CEG RIO from a relational perspective
  by: Sérgio A. P. Bastos, et al.
  Published: (2007-08-01)