Test Data Generation Framework for Automatic Programming Assessment

Automatic Programming Assessment (APA) has recently become a significant method for assisting educators of programming courses in automatically marking and grading students’ programs, as a counterpart to the typical manual tasks, which are error-prone and lead to inconsistency. By default, the test data generation process plays an important role in performing dynamic testing on students’ programs. In the software testing field, there are diverse automated methods for test data generation; unfortunately, APA seldom adopts them, and only a few studies have attempted to integrate APA with test data generation to include more useful features and to provide precise and thorough program testing. Thus, we propose a test data generation framework called FaSt-Gen that covers both functional and structural testing of a program for APA. It aims to assist lecturers of programming courses in furnishing an adequate set of test data to assess students’ programming solutions without requiring specific expertise in test case design. FaSt-Gen integrates positive and negative testing criteria (or reliable and valid test adequacy criteria) to derive the desired test data and a test set schema. The findings of the conducted experiment show that FaSt-Gen improves the reliability and validity of test data adequacy in programming assessment.
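The abstract describes the approach only at the framework level. As a rough illustration of the underlying idea (not the authors’ FaSt-Gen implementation), the sketch below shows how positive (valid, in-range and boundary) and negative (out-of-range) test data might be derived from a simple exercise specification and scored against a lecturer’s reference solution. The exercise, value ranges, and all function names are assumptions made for this example.

```python
import random

# Hypothetical exercise specification (illustrative assumption, not from the paper):
# "Read an integer mark in the range 0..100 and return its letter grade."
SPEC = {"min": 0, "max": 100}

def reference_solution(mark: int) -> str:
    """Lecturer's model answer, used as the test oracle."""
    if mark >= 80:
        return "A"
    if mark >= 60:
        return "B"
    if mark >= 40:
        return "C"
    return "F"

def generate_positive_data(n: int = 5) -> list[int]:
    """Valid test data: values inside the specified range, including
    the range and partition boundaries."""
    boundaries = [SPEC["min"], SPEC["max"], 40, 60, 80]
    randoms = [random.randint(SPEC["min"], SPEC["max"]) for _ in range(n)]
    return sorted(set(boundaries + randoms))

def generate_negative_data(n: int = 3) -> list[int]:
    """Invalid test data: values just outside the specified range.
    In a full framework these would exercise input validation/error handling."""
    below = [SPEC["min"] - 1 - i for i in range(n)]
    above = [SPEC["max"] + 1 + i for i in range(n)]
    return below + above

def assess(student_fn) -> float:
    """Run the student's function on the positive test set and return the
    fraction of cases whose output matches the oracle."""
    positive = generate_positive_data()
    passed = sum(1 for x in positive if student_fn(x) == reference_solution(x))
    return passed / len(positive)

if __name__ == "__main__":
    # A deliberately buggy student submission: wrong boundary for grade "A".
    def student_solution(mark: int) -> str:
        if mark > 80:
            return "A"
        if mark >= 60:
            return "B"
        if mark >= 40:
            return "C"
        return "F"

    print("positive test data:", generate_positive_data())
    print("negative test data:", generate_negative_data())
    print("functional score  :", assess(student_solution))
```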


Bibliographic Details
Main Authors: Rohaida, Romli; Shahida, Sulaiman; Kamal Z., Zamli
Format: Conference or Workshop Item
Language: English
Published: 2014
Subjects: QA76 Computer software
Online Access: http://umpir.ump.edu.my/id/eprint/8398/1/Test_Data_Generation_Framework_for_Automatic_Programming_Assessment.pdf
id UMPir8398
institution Universiti Malaysia Pahang
collection UMP
topic QA76 Computer software
citation Rohaida, Romli and Shahida, Sulaiman and Kamal Z., Zamli (2014) Test Data Generation Framework for Automatic Programming Assessment. In: 8th Malaysian Software Engineering Conference (MySEC), 23-24 September 2014, Langkawi, Kedah, pp. 84-89. (Published)
doi http://dx.doi.org/10.1109/MySec.2014.6985993
work_keys_str_mv AT rohaidaromli testdatagenerationframeworkforautomaticprogrammingassessment
AT shahidasulaiman testdatagenerationframeworkforautomaticprogrammingassessment
AT kamalzzamli testdatagenerationframeworkforautomaticprogrammingassessment