What does Chinese BERT learn about syntactic knowledge?

Pre-trained language models such as Bidirectional Encoder Representations from Transformers (BERT) have been applied to a wide range of natural language processing (NLP) tasks and have achieved strong results. A growing body of research has investigated why BERT is so effective...


Bibliographic details
Main authors: Jianyu Zheng, Ying Liu
Format: Article
Language: English
Published: PeerJ Inc. 2023-07-01
Series: PeerJ Computer Science
Subjects:
Online access: https://peerj.com/articles/cs-1478.pdf