EREC: Enhanced Language Representations with Event Chains

The natural language model BERT accumulates rich linguistic knowledge from a large-scale unsupervised corpus during its pretraining stage; this knowledge is then fine-tuned for specific downstream tasks, greatly improving performance on a variety of natural language understanding tasks. F...


Bibliographic Details
Main Authors: Huajie Wang, Yinglin Wang
Format: Article
Language: English
Published: MDPI AG 2022-12-01
Series: Information
Online Access: https://www.mdpi.com/2078-2489/13/12/582