EREC: Enhanced Language Representations with Event Chains
The natural language model BERT accumulates rich linguistic knowledge from a large-scale unsupervised corpus during its pretraining stage; this knowledge is then fine-tuned for specific downstream tasks, which greatly improves performance on a variety of natural language understanding tasks. For some specific tasks, the capability of the model can be further enhanced by introducing external knowledge...
Main Authors: | Huajie Wang, Yinglin Wang |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-12-01 |
Series: | Information |
Subjects: | external knowledge; event chains; commonsense causal reasoning; story ending prediction |
Online Access: | https://www.mdpi.com/2078-2489/13/12/582 |
_version_ | 1827638445027098624 |
author | Huajie Wang; Yinglin Wang |
author_facet | Huajie Wang; Yinglin Wang |
author_sort | Huajie Wang |
collection | DOAJ |
description | The natural language model BERT accumulates rich linguistic knowledge from a large-scale unsupervised corpus during its pretraining stage; this knowledge is then fine-tuned for specific downstream tasks, which greatly improves performance on a variety of natural language understanding tasks. For some specific tasks, the capability of the model can be further enhanced by introducing external knowledge. Methods such as ERNIE have been proposed for integrating knowledge graphs into BERT models, significantly enhancing their capabilities on related tasks such as entity recognition. However, for two types of tasks, commonsense causal reasoning and story ending prediction, few previous studies have combined model modification and process optimization to integrate external knowledge. Therefore, following ERNIE, this paper proposes **e**nhanced language **r**epresentation with **e**vent **c**hains (EREC), which focuses on keywords in the text corpus and the relations they imply. Event chains are integrated into EREC as external knowledge, and various graph networks are used to generate embeddings and to associate keywords in the corpus. Finally, via multi-task training, the external knowledge is integrated into the model produced in the pretraining stage so as to strengthen it on downstream tasks. The EREC experiments follow a three-stage design, and the results show that, by integrating event chains, EREC gains a deeper understanding of the causal and event relationships contained in the text, achieving significant improvements on the two tasks. |
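To make the data flow the abstract outlines concrete, here is a minimal PyTorch sketch of an ERNIE-style fusion: event embeddings produced by a graph network are injected into token representations at keyword positions. This is not the authors' code; the class names, dimensions, one-step mean-aggregation graph convolution (standing in for the paper's "various graph networks"), toy event chain, and keyword-alignment scheme are all illustrative assumptions.

```python
# Illustrative sketch only (not the EREC implementation): a one-step graph
# encoder over an event chain, plus fusion of event embeddings into token
# vectors at keyword positions, in the spirit of ERNIE's entity fusion.
import torch
import torch.nn as nn

class EventGraphEncoder(nn.Module):
    """One mean-aggregation graph-convolution step over an event chain."""
    def __init__(self, num_events: int, dim: int):
        super().__init__()
        self.event_emb = nn.Embedding(num_events, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, adj: torch.Tensor) -> torch.Tensor:
        # adj: (num_events, num_events) row-normalized adjacency of the chain.
        h = self.event_emb.weight               # (num_events, dim)
        return torch.relu(self.proj(adj @ h))   # each event sees its neighbors

class KeywordFusion(nn.Module):
    """Fuse an event embedding into the token vector at each keyword position."""
    def __init__(self, tok_dim: int, evt_dim: int):
        super().__init__()
        self.fuse = nn.Linear(tok_dim + evt_dim, tok_dim)

    def forward(self, tokens, event_vecs, align):
        # tokens: (seq, tok_dim); align[i] = event index for token i, or -1.
        out = tokens.clone()
        for i, e in enumerate(align):
            if e >= 0:
                out[i] = torch.tanh(self.fuse(torch.cat([tokens[i], event_vecs[e]])))
        return out

# Toy usage: a 3-event chain "leave -> drive -> arrive", linked in order.
enc = EventGraphEncoder(num_events=3, dim=8)
adj = torch.tensor([[0.0, 1.0, 0.0], [0.5, 0.0, 0.5], [0.0, 1.0, 0.0]])
event_vecs = enc(adj)

fusion = KeywordFusion(tok_dim=16, evt_dim=8)
tokens = torch.randn(5, 16)                    # stand-in for BERT outputs
align = [-1, 0, -1, 2, -1]                     # tokens 1 and 3 are keywords
print(fusion(tokens, event_vecs, align).shape) # torch.Size([5, 16])
```

In the paper itself the graph encoders, the fusion mechanism, and the multi-task training objectives are more elaborate; the sketch only fixes the shape of the pipeline the abstract describes.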
first_indexed | 2024-03-09T16:17:29Z |
format | Article |
id | doaj.art-2aa6e38eedac4b8ba8179085f08b98a8 |
institution | Directory Open Access Journal |
issn | 2078-2489 |
language | English |
last_indexed | 2024-03-09T16:17:29Z |
publishDate | 2022-12-01 |
publisher | MDPI AG |
record_format | Article |
series | Information |
spelling | doaj.art-2aa6e38eedac4b8ba8179085f08b98a8 | 2023-11-24T15:37:31Z | eng | MDPI AG | Information | ISSN 2078-2489 | 2022-12-01 | vol. 13, no. 12, art. 582 | doi:10.3390/info13120582 | EREC: Enhanced Language Representations with Event Chains | Huajie Wang; Yinglin Wang (School of Information Management and Engineering, Shanghai University of Finance and Economics, Shanghai 200433, China) | https://www.mdpi.com/2078-2489/13/12/582 | external knowledge; event chains; commonsense causal reasoning; story ending prediction |
spellingShingle | Huajie Wang; Yinglin Wang; EREC: Enhanced Language Representations with Event Chains; Information; external knowledge; event chains; commonsense causal reasoning; story ending prediction |
title | EREC: Enhanced Language Representations with Event Chains |
title_full | EREC: Enhanced Language Representations with Event Chains |
title_fullStr | EREC: Enhanced Language Representations with Event Chains |
title_full_unstemmed | EREC: Enhanced Language Representations with Event Chains |
title_short | EREC: Enhanced Language Representations with Event Chains |
title_sort | erec enhanced language representations with event chains |
topic | external knowledge; event chains; commonsense causal reasoning; story ending prediction |
url | https://www.mdpi.com/2078-2489/13/12/582 |
work_keys_str_mv | AT huajiewang erecenhancedlanguagerepresentationswitheventchains AT yinglinwang erecenhancedlanguagerepresentationswitheventchains |