Using computer adaptive testing to assess physics proficiency and improve exam performance in an introductory physics course


Bibliographic Details
Main Authors: Jason W. Morphew, Jose P. Mestre, Hyeon-Ah Kang, Hua-Hua Chang, Gregory Fabry
Format: Article
Language: English
Published: American Physical Society, 2018-09-01
Series: Physical Review Physics Education Research
ISSN: 2469-9896
Online Access: http://doi.org/10.1103/PhysRevPhysEducRes.14.020110

Description
Prior research has established that students often underprepare for midterm examinations yet remain overconfident in their proficiency. Research concerning the testing effect has demonstrated that using testing as a study strategy leads to higher performance and more accurate confidence than more common study strategies such as rereading or reviewing homework problems. We report on three experiments that explore the viability of using computer adaptive testing (CAT) for assessing students’ physics proficiency, for preparing students for midterm exams by diagnosing their weaknesses, and for predicting midterm exam scores in an introductory calculus-based mechanics course for science and engineering majors. The first two experiments evaluated the reliability and validity of the CAT algorithm; in addition, we investigated the ability of the CAT test to predict performance on the midterm exam. The third experiment explored whether completing two CAT tests in the days before a midterm exam would facilitate performance on that exam. Scores on the CAT tests and the midterm exams were significantly correlated and, on average, were not statistically different from each other, providing evidence for moderate parallel-forms reliability and criterion-related validity of the CAT algorithm. In addition, when used as a diagnostic tool, CAT showed promise in helping students perform better on midterm exams. Finally, we found that the CAT tests predicted average performance on the midterm exams reasonably well; however, they were not as accurate as desired at predicting the performance of individual students. While CAT shows promise for practice testing, more research is needed to refine testing algorithms and increase reliability before implementing CAT for summative evaluations. In light of these findings, we believe more research comparing CAT to traditional paper-and-pencil practice tests is needed to determine whether the effort required to create a CAT system is worthwhile.
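
For readers unfamiliar with how an adaptive test proceeds, the sketch below illustrates the basic CAT cycle the abstract refers to: estimate the examinee's ability from the responses so far, then administer the unused item that is most informative at that estimate. It is a minimal sketch assuming a two-parameter-logistic (2PL) IRT model, a grid-based maximum-likelihood ability estimate, and a simulated item bank; the parameter ranges and the `run_cat`/`answer_fn` names are illustrative assumptions, not the calibration or item-selection rule used in the study.

```python
# Minimal sketch of a CAT loop under a 2PL IRT model (illustrative only).
import numpy as np

def p_correct(theta, a, b):
    """2PL probability that an examinee with ability theta answers the item correctly."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1.0 - p)

def estimate_theta(responses, grid=np.linspace(-4.0, 4.0, 161)):
    """Grid-based maximum-likelihood ability estimate from (a, b, score) triples."""
    log_lik = np.zeros_like(grid)
    for a, b, score in responses:
        p = p_correct(grid, a, b)
        log_lik += score * np.log(p) + (1 - score) * np.log(1.0 - p)
    return grid[np.argmax(log_lik)]

def run_cat(item_bank, answer_fn, n_items=15):
    """Administer n_items items, each time choosing the unused item with
    maximum information at the current ability estimate."""
    theta, responses, used = 0.0, [], set()
    for _ in range(n_items):
        remaining = [i for i in range(len(item_bank)) if i not in used]
        best = max(remaining, key=lambda i: item_information(theta, *item_bank[i]))
        used.add(best)
        a, b = item_bank[best]
        score = answer_fn(best)                # 1 if answered correctly, 0 otherwise
        responses.append((a, b, score))
        theta = estimate_theta(responses)      # update the ability estimate
    return theta

if __name__ == "__main__":
    # Simulate a 50-item bank and an examinee with true ability 1.0.
    rng = np.random.default_rng(0)
    bank = [(rng.uniform(0.5, 2.0), rng.uniform(-2.0, 2.0)) for _ in range(50)]
    true_theta = 1.0
    answer = lambda i: int(rng.random() < p_correct(true_theta, *bank[i]))
    print("estimated ability:", run_cat(bank, answer))
```

In a deployed system the item bank would be calibrated from prior response data and the test would typically stop at a target standard error rather than after a fixed number of items; this sketch fixes the length only to keep the example short.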