Improving automated programming assessments: user experience evaluation using FaSt-generator

Bibliographic Details
Main Authors: Romli, Rohaida; Sulaiman, Shahida; Zamli, Kamal Zuhairi
Format: Conference or Workshop Item (peer reviewed)
Published: 2015, in The Third Information Systems International Conference (ISICO 2015), 2-4 Nov 2015, Indonesia (http://www.isico.info/)
Subjects: QA75 Electronic computers. Computer science
Institution: Universiti Teknologi Malaysia - ePrints
Record: utm.eprints-61040, http://eprints.utm.my/61040/
Description: Automatic Programming Assessment (APA) is a method for automatically marking and grading students' programming solutions. To realise APA as a tangible deliverable, a number of automated tools, known as Automated Programming Assessment Systems (APAS), have been developed and tested for decades. Reducing lecturers' workload, giving students timely feedback, and improving the accuracy of grading results are the common motivations for APAS. To carry out dynamic testing in APA, an appropriate and adequate set of test data must be prepared to judge the correctness of students' programming solutions with respect to functional and/or structural testing. Manual preparation of quality test data is a hard, time-consuming, and often infeasible task in the practice of both software testing and APA, so automated test data generation is highly desirable to relieve humans of such repetitive work. This paper describes the design, implementation, and user experience evaluation of FaSt-generator, a test data generator developed to support APA. The tool furnishes a test set containing adequate test data to execute both functional and structural testing in APA. Results of the user experience evaluation reveal that all subjects held relatively positive opinions of FaSt-generator and rated it favourably on the criteria of User Perception and End-User Computing Satisfaction (EUCS).
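For illustration only (this sketch is not part of the record and not the authors' implementation): dynamic testing in APA generally means executing a student's solution against a set of test data and comparing its output with that of a reference solution, with a generator such as FaSt-generator supplying the test set. Below is a minimal Python sketch under that assumption; all file names and helper functions are hypothetical.

    # Illustrative sketch of functional (dynamic) testing in APA.
    # All names are hypothetical; this is not the FaSt-generator tool itself.
    import subprocess

    def run_program(executable: str, test_input: str, timeout: float = 5.0) -> str:
        """Run a program, feed it one test case on stdin, and capture stdout."""
        result = subprocess.run(
            [executable], input=test_input,
            capture_output=True, text=True, timeout=timeout,
        )
        return result.stdout.strip()

    def grade(student_exe: str, reference_exe: str, test_set: list[str]) -> float:
        """Fraction of test cases on which the student's output matches
        the reference solution's output (functional testing)."""
        passed = 0
        for test_input in test_set:
            try:
                if run_program(student_exe, test_input) == run_program(reference_exe, test_input):
                    passed += 1
            except subprocess.TimeoutExpired:
                pass  # a hanging solution simply fails that test case
        return passed / len(test_set) if test_set else 0.0

    # A generator such as FaSt-generator would supply `test_set`;
    # it is hard-coded here for illustration.
    print(grade("./student", "./reference", ["1 2\n", "0 0\n", "-5 7\n"]))

Structural testing would additionally measure how much of the student's code the generated test set exercises (for example, via a coverage tool), which is why the abstract stresses an adequate test set for both testing aspects.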