An IRT–Multiple Indicators Multiple Causes (MIMIC) Approach as a Method of Examining Item Response Latency

The analysis of response time has received increasing attention over the last decades, since evidence from several studies supports a direct relationship between item response time and test performance. The aim of this study was to investigate whether item response latency affects a person's ability parameters, that is, whether it reflects an adaptive or a maladaptive practice. To examine this research question, data from 8,475 individuals completing the computerized version of the Postgraduate General Aptitude Test (PAGAT) were analyzed. To determine the extent to which response latency affects a person's ability, we used a Multiple Indicators Multiple Causes (MIMIC) model in which every item in a scale was linked to its corresponding covariate (i.e., item response latency). We ran the MIMIC model within the Item Response Theory (IRT) framework (2-PL model). The results supported the hypothesis that item response latency can provide valuable information for obtaining more accurate estimates of persons' ability levels. Results indicated that for individuals who invest more time on easy items, the likelihood of success does not improve, most likely because slow and fast responders have significantly different levels of ability (fast responders are of higher ability than slow responders); consequently, investing more time does not prove adaptive for low-ability individuals. The opposite was found for difficult items: individuals spending more time on difficult items increase their likelihood of success, most likely because they are high achievers (on difficult items, individuals who spent more time were of significantly higher ability than fast responders). Thus, there appears to be an interaction between item difficulty and person ability that explains the effects of response time on the likelihood of success. We concluded that accommodating item response latency in a computerized assessment model can inform test quality and test takers' behavior and, in that way, enhance score measurement accuracy.
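
To make the modeling approach described in the abstract concrete, below is a minimal sketch, assuming a 2-PL item response function extended with a direct (MIMIC-style) effect of an item-level latency covariate on the item response. The function name irt_mimic_prob and all parameter values are hypothetical illustrations, not taken from the study.

# Minimal, illustrative sketch (not the authors' code): a 2-PL item response
# function with a MIMIC-style direct effect of an item-level covariate
# (response latency) on the item response. All names and values are
# hypothetical, chosen for demonstration only.
import numpy as np

def irt_mimic_prob(theta, a, b, beta, latency):
    """Probability of a correct response under a 2-PL model with a direct
    covariate effect:
        logit P(correct) = a * (theta - b) + beta * latency
    where theta is person ability, a and b are item discrimination and
    difficulty, and beta is the direct effect of response latency.
    """
    logit = a * (theta - b) + beta * latency
    return 1.0 / (1.0 + np.exp(-logit))

theta = np.linspace(-3, 3, 7)  # a grid of person abilities

# Hypothetical easy item: a negative latency effect, so extra time does not
# raise the success probability (mirroring the direction of the findings
# summarized in the abstract).
print(irt_mimic_prob(theta, a=1.2, b=-1.0, beta=-0.3, latency=1.0))

# Hypothetical difficult item: a positive latency effect, so extra time
# raises the success probability.
print(irt_mimic_prob(theta, a=1.2, b=1.5, beta=0.4, latency=1.0))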

Bibliographic Details
Main Authors: Ioannis Tsaousis, Georgios D. Sideridis, Abdullah Al-Sadaawi
Format: Article
Language: English
Published: Frontiers Media S.A., 2018-11-01
Series: Frontiers in Psychology
ISSN: 1664-1078
DOI: 10.3389/fpsyg.2018.02177
Subjects: item response latency; computer based testing (CBT); educational testing; multiple indicator multiple causes model (MIMIC); IRT-MIMIC
Online Access: https://www.frontiersin.org/article/10.3389/fpsyg.2018.02177/full

Author Affiliations:
Ioannis Tsaousis: Department of Psychology, University of Crete, Rethymno, Greece
Georgios D. Sideridis: Institutional Centers for Clinical and Translational Research, Boston Children's Hospital, Harvard Medical School, Boston, MA, United States; Department of Primary Education, National and Kapodistrian University of Athens, Athens, Greece
Abdullah Al-Sadaawi: Psychology Department, College of Education, King Saud University, Riyadh, Saudi Arabia; National Center for Assessment, Riyadh, Saudi Arabia