Language effects in international testing: the case of PISA 2006 science items

We investigate the extent to which language versions (English, French and Arabic) of the same science test are comparable in terms of item difficulty and demands. We argue that language is an inextricable part of the scientific literacy construct, whether or not the examiner intends it to be. This argument has considerable implications for the methodologies used to establish the equivalence of multiple language versions of the same assessment, including in international assessments where cross-cultural fairness is a concern. We also argue that none of the available statistical or qualitative techniques is capable of teasing out the language variable and neutralising its potential effects on item difficulty and demands. Exploring the use of automated text analysis tools at the quality-control stage may help to address some of these challenges.

Bibliographic Details
Main Authors: El Masri, Y, Baird, J, Graesser, A
Format: Journal article
Language: English
Published: Routledge, 2016
Institution: University of Oxford
Identifier: oxford-uuid:1d110345-46ce-4408-9c46-2af139319578