Multidimensional item response theory and the Brief Electricity and Magnetism Assessment

This work is the fourth in a series of papers applying multidimensional item response theory (MIRT) to widely used physics conceptual assessments. This study applies exploratory and confirmatory MIRT analyses to the Brief Electricity and Magnetism Assessment (BEMA) to explore the assessment's structure and to determine a well-fitting model of the student knowledge measured by the assessment. These methods were used to investigate a large dataset (N=9666) from a research university in the United States. Exploratory analysis showed that a five-factor model had the best fit statistics; in four of the five factors, the items with the highest loadings belonged to the same item block. Confirmatory MIRT analysis fit a theoretical model developed from expert solutions to the instrument and identified two models with superior fit: a principle model and a topical model. The principle model consisted of 28 principles, the fundamental reasoning steps needed to solve items in the instrument; this was more principles than in any of the models from the previous confirmatory MIRT studies of the Force Concept Inventory, the Force and Motion Conceptual Evaluation, and the Conceptual Survey of Electricity and Magnetism. The second model, the topical model, consisted of five general subtopics of electromagnetism. Both the principle and topical models had excellent fit statistics; however, unlike for the other conceptual instruments studied, the topical model fit better. The five topical divisions were explored as possible subscales, but none had a Cronbach's α of at least 0.7, the minimum value required for low-stakes testing.
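The 0.7 reliability threshold mentioned in the abstract refers to Cronbach's α, a standard internal-consistency statistic for subscales. As an illustrative sketch (not taken from the paper; the data below are simulated, and the function name is our own), α can be computed from a students-by-items matrix of dichotomous scores like so:

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for an (n_students, n_items) score matrix."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                          # number of items
    item_vars = responses.var(axis=0, ddof=1)       # per-item sample variance
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 0/1 responses for a hypothetical 6-item subscale:
# each student's answers share a common latent ability plus item noise.
rng = np.random.default_rng(0)
ability = rng.normal(size=500)
scores = (ability[:, None] + rng.normal(size=(500, 6)) > 0).astype(int)

alpha = cronbach_alpha(scores)
print(f"alpha = {alpha:.2f}")  # values below 0.7 flag a subscale as unreliable
```

In the study's terms, each of the five topical subscales would be scored this way over its own item subset, and all five fell short of the 0.7 criterion.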

Bibliographic Details
Main Authors: John Hansen, John Stewart
Format: Article
Language: English
Published: American Physical Society, 2021-11-01
Series: Physical Review Physics Education Research
Online Access: http://doi.org/10.1103/PhysRevPhysEducRes.17.020139
ISSN: 2469-9896