Improving Low-Resource Chinese Named Entity Recognition Using Bidirectional Encoder Representation from Transformers and Lexicon Adapter
Because each brings complementary strengths, the integration of lexicon information with pre-trained models such as BERT has been widely adopted for Chinese sequence labeling tasks. However, because these models demand large amounts of training data, efforts have been made to enhance their performance in low-resource scenarios. C...
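The abstract points to a BERT-plus-lexicon-adapter architecture for Chinese NER but, being truncated here, does not describe its internals. Below is a minimal, hypothetical PyTorch sketch of the general lexicon-adapter idea: character-level BERT hidden states are fused with embeddings of dictionary words matched at each character position, weighted by an attention score. The class name `LexiconAdapter`, the tensor shapes, and the bilinear attention form are illustrative assumptions, not the article's implementation.

```python
# A minimal sketch (assumption: not the article's exact method) of a lexicon
# adapter: BERT character hidden states are combined with embeddings of the
# lexicon words matched at each character, using attention over the matches.
import torch
import torch.nn as nn


class LexiconAdapter(nn.Module):
    def __init__(self, hidden_size: int, word_emb_size: int):
        super().__init__()
        # Project lexicon word embeddings into the BERT hidden space.
        self.word_proj = nn.Linear(word_emb_size, hidden_size)
        # Bilinear attention weight used to score each matched word.
        self.attn = nn.Linear(hidden_size, hidden_size, bias=False)
        self.layer_norm = nn.LayerNorm(hidden_size)

    def forward(self, char_hidden, word_embs, word_mask):
        # char_hidden: (batch, seq_len, hidden)          BERT character states
        # word_embs:   (batch, seq_len, n_words, w_dim)  matched lexicon words
        # word_mask:   (batch, seq_len, n_words)         1 = real match, 0 = pad
        words = self.word_proj(word_embs)                       # -> hidden dim
        scores = torch.einsum("bsh,bsnh->bsn", self.attn(char_hidden), words)
        scores = scores.masked_fill(word_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        # A character with no matched word yields all -inf scores (NaN after
        # softmax); zero those weights so the fusion falls back to char_hidden.
        weights = torch.nan_to_num(weights)
        fused = torch.einsum("bsn,bsnh->bsh", weights, words)
        return self.layer_norm(char_hidden + fused)


# Toy usage with random tensors (shapes only; no real BERT or lexicon here).
adapter = LexiconAdapter(hidden_size=768, word_emb_size=200)
chars = torch.randn(2, 10, 768)
words = torch.randn(2, 10, 3, 200)
mask = torch.randint(0, 2, (2, 10, 3))
out = adapter(chars, words, mask)
print(out.shape)  # torch.Size([2, 10, 768])
```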
Main Authors: | |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-09-01 |
Series: | Applied Sciences |
Subjects: | |
Online Access: | https://www.mdpi.com/2076-3417/13/19/10759 |