Examining evolving performance on the Force Concept Inventory using factor analysis

The application of factor analysis to the Force Concept Inventory (FCI) has proven to be problematic. Some studies have suggested that factor analysis of test results serves as a helpful tool in assessing the recognition of Newtonian concepts by students. Other work has produced at best ambiguous results. For the FCI administered as a pre- and post-test, we see factor analysis as a tool by which the changes in conceptual associations made by our students may be gauged given the evolution of their response patterns. This analysis allows us to identify and track conceptual linkages, affording us insight as to how our students have matured due to instruction. We report on our analysis of 427 pre- and post-tests. The factor models for the pre- and post-tests are explored and compared along with the methodology by which these models were fit to the data. The post-test factor pattern is more aligned with an expert’s interpretation of the questions’ content, as it allows for a more readily identifiable relationship between factors and physical concepts. We discuss this evolution in the context of approaching the characteristics of an expert with force concepts. Also, we find that certain test items do not significantly contribute to the pre- or post-test factor models, and we attempt explanations as to why this is so. This suggests that such questions may not be effective in probing the conceptual understanding of our students.
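The study described above rests on fitting exploratory factor models to matrices of dichotomous (correct/incorrect) FCI responses and comparing the pre- and post-test loading patterns. The following is a purely illustrative sketch of that general workflow, not the authors' code; the number of factors, the rotation, the loading threshold, and the use of scikit-learn are placeholder assumptions, and the paper's actual model-fitting methodology may differ (for example, in its correlation measure or factor-extraction method).

# Illustrative sketch only: fit separate exploratory factor models to pre- and
# post-test FCI response matrices and compare which factor each item loads on.
# Assumptions (not taken from the paper): responses are 0/1 NumPy arrays of shape
# (n_students, 30), one column per item of the 30-item FCI; 5 factors and a
# varimax rotation are placeholders.
import numpy as np
from sklearn.decomposition import FactorAnalysis

def fit_factor_model(responses: np.ndarray, n_factors: int = 5) -> np.ndarray:
    """Fit an exploratory factor model; return loadings with shape (items, factors)."""
    fa = FactorAnalysis(n_components=n_factors, rotation="varimax", random_state=0)
    fa.fit(responses)
    return fa.components_.T  # rows: FCI items, columns: factors

def dominant_factor(loadings: np.ndarray, threshold: float = 0.3) -> np.ndarray:
    """For each item, give the index of the factor with the largest |loading|,
    or -1 if no loading exceeds the threshold (an item that does not contribute
    appreciably to the model). The 0.3 cutoff is a common rule of thumb, not a
    value taken from the study."""
    best = np.argmax(np.abs(loadings), axis=1)
    weak = np.max(np.abs(loadings), axis=1) < threshold
    return np.where(weak, -1, best)

# Usage with random placeholder data standing in for 427 students x 30 items:
rng = np.random.default_rng(0)
pre = rng.integers(0, 2, size=(427, 30)).astype(float)
post = rng.integers(0, 2, size=(427, 30)).astype(float)
pre_map = dominant_factor(fit_factor_model(pre))
post_map = dominant_factor(fit_factor_model(post))
print("Items whose dominant factor changed:", np.flatnonzero(pre_map != post_map))

Comparing the pre- and post-test mappings in this way mirrors, in simplified form, the kind of item-to-factor tracking the abstract describes; interpreting what each factor means physically still requires expert inspection of the loading pattern.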

Bibliographic Details
Main Authors: M. R. Semak, R. D. Dietz, R. H. Pearson, C. W. Willis
Format: Article
Language: English
Published: American Physical Society, 2017-01-01
Series: Physical Review Physics Education Research
ISSN: 2469-9896
Collection: DOAJ (Directory of Open Access Journals)
Online Access: http://doi.org/10.1103/PhysRevPhysEducRes.13.010103