Design and Development of a Framework for an Automatic Answer Evaluation System Based on Similarity Measures
Assessing answer scripts is a demanding task that requires sustained concentration and consistent judgment from evaluators; this motivates automating answer script evaluation. For evaluating textual answers, sentence similarity...
Main Authors: | Ramamurthy Madhumitha, Krishnamurthi Ilango |
---|---|
Format: | Article |
Language: | English |
Published: | De Gruyter, 2017-04-01 |
Series: | Journal of Intelligent Systems |
Subjects: | assessment; automatic answer evaluation system; cosine similarity; similarity measure; sentence similarity |
Online Access: | https://doi.org/10.1515/jisys-2015-0031 |
_version_ | 1818716614138265600 |
---|---|
author | Ramamurthy Madhumitha Krishnamurthi Ilango |
author_facet | Ramamurthy Madhumitha Krishnamurthi Ilango |
author_sort | Ramamurthy Madhumitha |
collection | DOAJ |
description | Assessing answer scripts is a demanding task that requires sustained concentration and consistent judgment from evaluators; this motivates automating answer script evaluation. For evaluating textual answers, sentence similarity measures have been widely used to compare student-written answers with reference texts. In this paper, we propose an automated answer evaluation system that scores answers using our proposed cosine-based sentence similarity measures; cosine measures have proved effective for comparing free-text student answers with reference texts. We introduce a set of novel cosine-based sentence similarity measures that take varied approaches to constructing the document vector space. In addition, we propose a novel synset-based word similarity measure for computing document vectors, coupled with several dimensionality-reduction approaches for shrinking the vector space. In total, we propose 21 cosine-based sentence similarity measures and evaluate their performance on the MSR paraphrase corpus and Li's benchmark dataset. We also apply these measures in the automatic answer evaluation system and compare their performance on the Kaggle short-answer and essay datasets. System-generated scores are compared with human scores using Pearson correlation, and the results show that the system scores correlate with the human scores. |
first_indexed | 2024-12-17T19:22:03Z |
format | Article |
id | doaj.art-124a05d03397473092c417c85c44008f |
institution | Directory Open Access Journal |
issn | 0334-1860 2191-026X |
language | English |
last_indexed | 2024-12-17T19:22:03Z |
publishDate | 2017-04-01 |
publisher | De Gruyter |
record_format | Article |
series | Journal of Intelligent Systems |
spelling | doaj.art-124a05d03397473092c417c85c44008f; 2022-12-21T21:35:29Z; eng; De Gruyter; Journal of Intelligent Systems; ISSN 0334-1860, 2191-026X; 2017-04-01; vol. 26, no. 2, pp. 243-262; 10.1515/jisys-2015-0031; Design and Development of a Framework for an Automatic Answer Evaluation System Based on Similarity Measures; Ramamurthy Madhumitha and Krishnamurthi Ilango (both: Department of Computer Science and Engineering, Sri Krishna College of Engineering and Technology, Coimbatore 641008, Tamil Nadu, India); (abstract identical to the description above); https://doi.org/10.1515/jisys-2015-0031; assessment; automatic answer evaluation system; cosine similarity; similarity measure; sentence similarity |
spellingShingle | Ramamurthy Madhumitha Krishnamurthi Ilango Design and Development of a Framework for an Automatic Answer Evaluation System Based on Similarity Measures Journal of Intelligent Systems assessment automatic answer evaluation system cosine similarity similarity measure sentence similarity |
title | Design and Development of a Framework for an Automatic Answer Evaluation System Based on Similarity Measures |
title_full | Design and Development of a Framework for an Automatic Answer Evaluation System Based on Similarity Measures |
title_fullStr | Design and Development of a Framework for an Automatic Answer Evaluation System Based on Similarity Measures |
title_full_unstemmed | Design and Development of a Framework for an Automatic Answer Evaluation System Based on Similarity Measures |
title_short | Design and Development of a Framework for an Automatic Answer Evaluation System Based on Similarity Measures |
title_sort | design and development of a framework for an automatic answer evaluation system based on similarity measures |
topic | assessment automatic answer evaluation system cosine similarity similarity measure sentence similarity |
url | https://doi.org/10.1515/jisys-2015-0031 |
work_keys_str_mv | AT ramamurthymadhumitha designanddevelopmentofaframeworkforanautomaticanswerevaluationsystembasedonsimilaritymeasures AT krishnamurthiilango designanddevelopmentofaframeworkforanautomaticanswerevaluationsystembasedonsimilaritymeasures |
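The description above centers on two computations: cosine similarity between a student answer and a reference text, and Pearson correlation between system-generated and human scores. The following is a minimal Python sketch of those two computations only, using plain term-frequency vectors and illustrative toy data; it is not the authors' implementation and omits the paper's synset-based word similarity, dimensionality reduction, and the 21 measure variants.

```python
# Minimal sketch: cosine similarity over term-frequency vectors and Pearson
# correlation between system and human scores. Toy data is hypothetical.
from collections import Counter
import math


def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine of the angle between simple term-frequency vectors of two texts."""
    va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0


# Hypothetical example: one reference answer, three student answers, and
# human-assigned marks on a 0-1 scale (illustrative values only).
reference = "photosynthesis converts light energy into chemical energy in plants"
answers = [
    "plants convert light energy into chemical energy through photosynthesis",
    "photosynthesis makes chemical energy from light in plants",
    "respiration releases energy stored in glucose",
]
human_scores = [0.9, 0.85, 0.1]

system_scores = [cosine_similarity(a, reference) for a in answers]
print("system scores:", [round(s, 3) for s in system_scores])
print("Pearson correlation with human scores:",
      round(pearson(system_scores, human_scores), 3))
```

In this sketch the system score for an answer is simply its cosine similarity to the reference text; the paper instead builds the document vectors with synset-based word similarity and reduced dimensionality before applying the cosine measure.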