Context Matters: Accounting for Item Features in the Assessment of Citizen Scientists’ Scientific Reasoning Skills

Citizen science (CS) projects engage citizens for research purposes and promote individual learning outcomes such as scientific reasoning (SR) skills. SR refers to participants’ ability to solve problems scientifically. However, the evaluation of CS projects’ effects on learning outcomes has suffered from a lack of assessment instruments and resources. Assessments of SR have most often been validated in the context of formal education; they do not contextualize items to be authentic or to represent the wide variety of disciplines and contexts in CS research. Here, we describe the development of an assessment instrument that can be flexibly adapted to different CS research contexts. Furthermore, we show that this instrument, the SR questionnaire, provides valid conclusions about participants’ SR skills. We found that the deep-structure and surface features of the items in the SR questionnaire represent the thinking processes associated with SR to a substantial extent. We suggest that practitioners and researchers consider these item features in future adaptations of the SR questionnaire. Doing so will most likely enable them to draw valid conclusions about participants’ SR skills and to gain a deeper understanding of those skills in CS project evaluation.
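The record’s subject terms include “explanatory Rasch model,” the modeling approach behind the reported finding that deep-structure and surface item features account for item difficulty. As a purely illustrative sketch (the equation and symbol names below are standard linear logistic test model notation, not reproduced from the article), such a model expresses the probability that person p solves item i as

\[
P(X_{pi} = 1 \mid \theta_p) = \frac{\exp\bigl(\theta_p - \sum_{k} q_{ik}\,\eta_k\bigr)}{1 + \exp\bigl(\theta_p - \sum_{k} q_{ik}\,\eta_k\bigr)}
\]

where \(\theta_p\) is the person’s SR ability, \(q_{ik}\) indicates whether item i exhibits feature k (for example, a particular deep-structure or surface feature), and \(\eta_k\) is the difficulty contribution attributed to that feature.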

Bibliographic Details
Main Authors: Till Bruckermann (Leibniz University Hannover; IPN – Leibniz Institute for Science and Mathematics Education), Tanja M. Straka (Technische Universität Berlin), Milena Stillfried (Leibniz Institute for Zoo and Wildlife Research), Moritz Krell (IPN – Leibniz Institute for Science and Mathematics Education)
Format: Article
Language: English
Published: Ubiquity Press, 2021-11-01
Series: Citizen Science: Theory and Practice
ISSN: 2057-4991
DOI: 10.5334/cstp.309
Subjects: scientific reasoning; assessment; explanatory Rasch model; evaluation; learning outcomes; science inquiry skills
Online Access: https://theoryandpractice.citizenscienceassociation.org/articles/309