Empowering Short Answer Grading: Integrating Transformer-Based Embeddings and BI-LSTM Network
Automated scoring systems have been revolutionized by natural language processing, enabling the evaluation of students’ diverse answers across various academic disciplines. However, this presents a challenge, as students’ responses may vary significantly in length, structure, and content. To tackle this challenge, this research introduces a novel automated model for short answer grading. The proposed model uses pretrained transformer models, specifically T5, in conjunction with a BI-LSTM architecture, which processes sequential data effectively by considering both past and future context. The research evaluated several preprocessing techniques and hyperparameter settings to identify the most efficient architecture. Experiments were conducted on a standard benchmark dataset, the North Texas Dataset, achieving a state-of-the-art correlation value of 92.5 percent. The proposed model’s accuracy has significant implications for education: it can save educators considerable time and effort while providing a reliable and fair evaluation for students, ultimately leading to improved learning outcomes.
Main Authors: | Wael H. Gomaa, Abdelrahman E. Nagib, Mostafa M. Saeed, Abdulmohsen Algarni, Emad Nabil |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-06-01 |
Series: | Big Data and Cognitive Computing |
Subjects: | automatic scoring; short answer grading; transformers; deep learning; AI in education |
Online Access: | https://www.mdpi.com/2504-2289/7/3/122 |
author | Wael H. Gomaa; Abdelrahman E. Nagib; Mostafa M. Saeed; Abdulmohsen Algarni; Emad Nabil |
collection | DOAJ |
description | Automated scoring systems have been revolutionized by natural language processing, enabling the evaluation of students’ diverse answers across various academic disciplines. However, this presents a challenge, as students’ responses may vary significantly in length, structure, and content. To tackle this challenge, this research introduces a novel automated model for short answer grading. The proposed model uses pretrained transformer models, specifically T5, in conjunction with a BI-LSTM architecture, which processes sequential data effectively by considering both past and future context. The research evaluated several preprocessing techniques and hyperparameter settings to identify the most efficient architecture. Experiments were conducted on a standard benchmark dataset, the North Texas Dataset, achieving a state-of-the-art correlation value of 92.5 percent. The proposed model’s accuracy has significant implications for education: it can save educators considerable time and effort while providing a reliable and fair evaluation for students, ultimately leading to improved learning outcomes. |
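The abstract describes token embeddings from a pretrained T5 model feeding a bidirectional LSTM that considers both past and future context. The record gives no implementation details, so the following is only a minimal sketch of that pipeline shape: random vectors stand in for T5 embeddings, a toy pure-Python LSTM cell is run forward and backward over the sequence, and the concatenated final states are mapped to a bounded grade. All names, sizes, and the 0–5 score range are assumptions for illustration, not the authors' actual model.

```python
# Sketch of a BiLSTM grader over placeholder "T5" embeddings.
# Everything here (sizes, score range, pooling) is a hypothetical stand-in.
import math
import random

random.seed(0)
EMB, HID = 8, 4  # toy dimensions; the paper's hyperparameters are not in the record


def rand_mat(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


class LSTMCell:
    """One direction of the BiLSTM: input/forget/output/candidate gates."""

    def __init__(self):
        self.W = rand_mat(4 * HID, EMB + HID)  # gate weights over [x; h]
        self.b = [0.0] * (4 * HID)

    def step(self, x, h, c):
        z = x + h  # concatenate input with previous hidden state
        gates = [sum(w * v for w, v in zip(row, z)) + b
                 for row, b in zip(self.W, self.b)]
        i = [sigmoid(g) for g in gates[0:HID]]            # input gate
        f = [sigmoid(g) for g in gates[HID:2 * HID]]      # forget gate
        o = [sigmoid(g) for g in gates[2 * HID:3 * HID]]  # output gate
        g = [math.tanh(v) for v in gates[3 * HID:4 * HID]]  # candidate
        c = [fj * cj + ij * gj for fj, cj, ij, gj in zip(f, c, i, g)]
        h = [oj * math.tanh(cj) for oj, cj in zip(o, c)]
        return h, c


def run(cell, seq):
    h, c = [0.0] * HID, [0.0] * HID
    for x in seq:
        h, c = cell.step(x, h, c)
    return h


def grade(token_embeddings, max_score=5.0):
    fwd, bwd = LSTMCell(), LSTMCell()
    h_f = run(fwd, token_embeddings)        # reads past context
    h_b = run(bwd, token_embeddings[::-1])  # reads future context
    pooled = h_f + h_b                      # concatenated final states
    w = [random.uniform(-0.5, 0.5) for _ in pooled]
    return sigmoid(sum(a * b for a, b in zip(w, pooled))) * max_score


# Hypothetical input: 6 token vectors standing in for T5 embeddings of an answer.
answer = [[random.gauss(0, 1) for _ in range(EMB)] for _ in range(6)]
s = grade(answer)
assert 0.0 <= s <= 5.0
```

In a real system the placeholder vectors would come from a pretrained T5 encoder and the weights would be trained against reference grades; the point of the sketch is only the data flow the abstract names: embeddings → bidirectional recurrence → pooled states → one score.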
format | Article |
id | doaj.art-394ca4faa51e4edcbd67b793da6ba7ce |
institution | Directory Open Access Journal |
issn | 2504-2289 |
language | English |
publishDate | 2023-06-01 |
publisher | MDPI AG |
record_format | Article |
series | Big Data and Cognitive Computing |
doi | 10.3390/bdcc7030122
affiliations | Wael H. Gomaa: Faculty of Computers and Artificial Intelligence, Beni-Suef University, Beni Suef 62511, Egypt; Abdelrahman E. Nagib: Faculty of Computer Science, 6th of October Campus, MSA University, Giza 12566, Egypt; Mostafa M. Saeed: Faculty of Computer Science, 6th of October Campus, MSA University, Giza 12566, Egypt; Abdulmohsen Algarni: Faculty of Computer Science, King Khalid University, Abha 61421, Saudi Arabia; Emad Nabil: Faculty of Computer and Information Systems, Islamic University of Madinah, Madinah 42351, Saudi Arabia
title | Empowering Short Answer Grading: Integrating Transformer-Based Embeddings and BI-LSTM Network |
topic | automatic scoring; short answer grading; transformers; deep learning; AI in education |
url | https://www.mdpi.com/2504-2289/7/3/122 |