Korean Semantic Role Labeling with Bidirectional Encoder Representations from Transformers and Simple Semantic Information
State-of-the-art semantic role labeling (SRL) performance has been achieved with neural network models that incorporate syntactic feature information such as dependency trees. In recent years, end-to-end neural network models have achieved state-of-the-art SRL performance even without syntactic features, and the advent of the bidirectional encoder representations from transformers (BERT) language model brought a further breakthrough. Although the semantic information of each word in a sentence is important in determining the word's meaning, previous end-to-end neural network studies did not utilize semantic information. In this study, we propose a BERT-based SRL model that uses simple semantic information without syntactic feature information. To obtain the semantic information, we used PropBank, which describes the relational information between predicates and arguments. In addition, text-originated feature information obtained from the training text data was utilized. Our proposed model achieved state-of-the-art results on both the Korean PropBank and CoNLL-2009 English benchmarks.
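As a rough illustration only (this is not the authors' implementation; the pretrained model name, feature inventory size, embedding dimension, and the concatenation scheme are all assumptions), a BERT token-classification SRL tagger augmented with a simple per-token semantic feature could be sketched in PyTorch as follows:

```python
# Minimal sketch, assuming a per-token semantic feature id (e.g., a
# PropBank-derived predicate/argument-type id) is precomputed per input.
# All names and sizes below are illustrative, not from the paper.
import torch
import torch.nn as nn
from transformers import BertModel

class BertSrlWithSemanticFeatures(nn.Module):
    def __init__(self, model_name="bert-base-multilingual-cased",
                 num_labels=30, num_feature_ids=100, feature_dim=32):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        # Embedding table for the hypothetical semantic feature ids.
        self.feature_emb = nn.Embedding(num_feature_ids, feature_dim)
        # Classify each token from BERT output + semantic feature embedding.
        self.classifier = nn.Linear(
            self.bert.config.hidden_size + feature_dim, num_labels)

    def forward(self, input_ids, attention_mask, feature_ids):
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        feats = self.feature_emb(feature_ids)    # (batch, seq, feature_dim)
        # Concatenate along the hidden dimension, then predict role labels.
        return self.classifier(torch.cat([hidden, feats], dim=-1))
```

The key design point the abstract suggests is that the extra signal is a lightweight semantic feature rather than a parsed syntactic structure, so it can be injected as a simple per-token embedding without a dependency parser in the pipeline.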
Main Authors: | Jangseong Bae (Language AI Lab, LG CNS, Seoul 07795, Korea); Changki Lee (Department of Computer Science and Engineering, Kangwon National University, Chuncheon 24341, Korea) |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-06-01 |
Series: | Applied Sciences |
ISSN: | 2076-3417 |
DOI: | 10.3390/app12125995 |
Subjects: | Korean semantic role labeling; BERT; semantic information; text-originated feature information |
Online Access: | https://www.mdpi.com/2076-3417/12/12/5995 |