Dual-scale BERT using multi-trait representations for holistic and trait-specific essay grading

As automated essay scoring (AES) has progressed from handcrafted techniques to deep learning, holistic scoring capabilities have emerged. However, specific trait assessment remains a challenge because of the limited depth of earlier methods in modeling dual assessments for holistic and multi-trait ta...

Full description

Bibliographic Details
Main Authors: Minsoo Cho, Jin-Xia Huang, Oh-Woog Kwon
Format: Article
Language: English
Published: Electronics and Telecommunications Research Institute (ETRI), 2024-02-01
Series: ETRI Journal
Subjects: automated essay scoring, deep learning methods, multi-task learning, multi-trait scoring, transformer-based models
Online Access: https://doi.org/10.4218/etrij.2023-0324
As automated essay scoring (AES) has progressed from handcrafted techniques to deep learning, holistic scoring capabilities have emerged. However, specific trait assessment remains a challenge because of the limited depth of earlier methods in modeling dual assessments for holistic and multi-trait tasks. To overcome this challenge, we explore providing comprehensive feedback while modeling the interconnections between holistic and trait representations. We introduce the DualBERT-Trans-CNN model, which combines transformer-based representations with a novel dual-scale bidirectional encoder representations from transformers (BERT) encoding approach at the document level. By explicitly leveraging multi-trait representations in a multi-task learning (MTL) framework, our DualBERT-Trans-CNN emphasizes the interrelation between holistic and trait-based score predictions, aiming for improved accuracy. For validation, we conducted extensive tests on the ASAP++ and TOEFL11 datasets. Against models in the same MTL setting, ours showed a 2.0% increase in holistic-score performance. Additionally, compared with single-task learning (STL) models, ours demonstrated a 3.6% enhancement in average multi-trait performance on the ASAP++ dataset.
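The full method is in the article at the DOI above. As a rough illustration only of the multi-task learning setup the abstract describes — a shared representation trained jointly on one holistic score and several trait scores — here is a minimal sketch of a joint MTL objective. The function names, the trait choices, and the equal loss weighting are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch of a multi-task scoring objective: a holistic score and
# several trait scores are trained jointly, so both tasks shape the shared
# encoder. All names and weights here are assumptions, not the paper's code.

def mse(pred, gold):
    """Mean squared error over a batch of scalar scores."""
    return sum((p - g) ** 2 for p, g in zip(pred, gold)) / len(pred)

def joint_mtl_loss(holistic_pred, holistic_gold,
                   trait_preds, trait_golds, trait_weight=1.0):
    """Combine the holistic loss with the mean of the per-trait losses."""
    holistic_loss = mse(holistic_pred, holistic_gold)
    trait_losses = [mse(p, g) for p, g in zip(trait_preds, trait_golds)]
    trait_loss = sum(trait_losses) / len(trait_losses)
    return holistic_loss + trait_weight * trait_loss

# Example: a batch of two essays with two hypothetical traits
# (e.g. "content" and "organization"), scored on different ranges.
loss = joint_mtl_loss(
    holistic_pred=[7.5, 6.0], holistic_gold=[8.0, 6.0],
    trait_preds=[[3.0, 2.5], [4.0, 3.0]],
    trait_golds=[[3.0, 3.0], [4.0, 3.0]],
)
print(loss)  # → 0.1875
```

In a real model the predictions would come from separate heads over a shared document encoding; `trait_weight` controls how strongly the trait tasks influence that shared representation.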
ISSN: 1225-6463, 2233-7326