Using video-based examiner score comparison and adjustment (VESCA) to compare the influence of examiners at different sites in a distributed objective structured clinical exam (OSCE)
Abstract Purpose: Ensuring equivalence of examiners’ judgements within distributed objective structured clinical exams (OSCEs) is key to both fairness and validity, but is hampered by the lack of cross-over in the performances which different groups of examiners observe. This study develops a novel method...
Main Authors: | Peter Yeates, Adriano Maluf, Natalie Cope, Gareth McCray, Stuart McBain, Dominic Beardow, Richard Fuller, Robert Bob McKinley |
Format: | Article |
Language: | English |
Published: | BMC, 2023-10-01 |
Series: | BMC Medical Education |
Subjects: | OSCE; Assessment; Equivalence; Examiner-Cohorts; Distributed Assessment |
Online Access: | https://doi.org/10.1186/s12909-023-04774-4 |
_version_ | 1797647294053482496 |
author | Peter Yeates, Adriano Maluf, Natalie Cope, Gareth McCray, Stuart McBain, Dominic Beardow, Richard Fuller, Robert Bob McKinley
author_sort | Peter Yeates |
collection | DOAJ |
description | Abstract Purpose: Ensuring equivalence of examiners’ judgements within distributed objective structured clinical exams (OSCEs) is key to both fairness and validity, but is hampered by the lack of cross-over in the performances which different groups of examiners observe. This study develops a novel method called Video-based Examiner Score Comparison and Adjustment (VESCA) and uses it, for the first time, to compare examiners’ scoring across different OSCE sites. Materials/methods: Within a summative 16-station OSCE, volunteer students were videoed on each station and all examiners were invited to score station-specific comparator videos in addition to their usual student scoring. The linkage provided by the video scores enabled Many Facet Rasch Modelling (MFRM) to compare (1) examiner-cohort and (2) site effects on students’ scores. Results: Examiner-cohorts varied by 6.9% in the overall score allocated to students of the same ability. Whilst only a very small difference was apparent between sites, examiner-cohort variability was greater in one site than the other. Adjusting student scores produced a median change in rank position of 6 places (0.48 deciles); however, 26.9% of students changed their rank position by at least 1 decile. By contrast, only 1 student’s pass/fail classification was altered by score adjustment. Conclusions: Whilst comparatively limited examiner participation rates may limit interpretation of score adjustment in this instance, this study demonstrates the feasibility of using VESCA for quality assurance purposes in large-scale distributed OSCEs.
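The core idea the abstract describes is a linkage design: examiner-cohorts that never see the same live students all score a common set of comparator videos, and those shared scores reveal each cohort's relative leniency, which can then be removed from the live scores. The sketch below illustrates that adjustment logic only in simplified form; it uses mean-centring on raw percentage scores rather than the study's Many Facet Rasch Modelling (which works on a logit scale and models students, stations and examiner-cohorts jointly), and all cohort, student and score values are invented for illustration.

```python
# Minimal sketch of the score-linkage idea behind VESCA, NOT the authors'
# MFRM analysis: shared comparator-video scores estimate each examiner-cohort's
# relative leniency, which is then subtracted from that cohort's live scores.
# All data below are hypothetical.
from statistics import mean

# Hypothetical video scores: cohort -> scores awarded to the SAME comparator videos
video_scores = {
    "cohort_A": [62.0, 70.0, 55.0],
    "cohort_B": [68.0, 75.0, 61.0],  # scores the same videos more generously
}

# Hypothetical live OSCE scores awarded by each cohort to its own students
live_scores = {
    "cohort_A": {"student_1": 64.0, "student_2": 58.0},
    "cohort_B": {"student_3": 66.0, "student_4": 71.0},
}

# Grand mean of all video scores acts as the common reference point
all_video = [s for scores in video_scores.values() for s in scores]
reference = mean(all_video)

# Cohort leniency = how far its video scoring sits above/below the reference
leniency = {c: mean(scores) - reference for c, scores in video_scores.items()}

# Adjusted live scores: remove each cohort's leniency so students are comparable
adjusted = {
    student: score - leniency[cohort]
    for cohort, students in live_scores.items()
    for student, score in students.items()
}

for student, score in sorted(adjusted.items()):
    print(f"{student}: adjusted score {score:.1f}")
```

Under this toy adjustment, students scored by the more generous cohort_B have their marks lowered and cohort_A's raised, which is the same direction of correction the study applies before re-ranking students and re-checking pass/fail classifications.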
first_indexed | 2024-03-11T15:15:12Z |
format | Article |
id | doaj.art-dadc9472cbce41038bb81547b72549cc |
institution | Directory Open Access Journal |
issn | 1472-6920 |
language | English |
last_indexed | 2024-03-11T15:15:12Z |
publishDate | 2023-10-01 |
publisher | BMC |
record_format | Article |
series | BMC Medical Education |
author affiliations | Peter Yeates, Adriano Maluf, Natalie Cope, Gareth McCray, Stuart McBain, Dominic Beardow, Robert Bob McKinley: School of Medicine, Keele University; Richard Fuller: Christie Education, Christie Hospitals NHS Foundation Trust
title | Using video-based examiner score comparison and adjustment (VESCA) to compare the influence of examiners at different sites in a distributed objective structured clinical exam (OSCE) |
topic | OSCE Assessment Equivalence Examiner-Cohorts Distributed Assessment |
url | https://doi.org/10.1186/s12909-023-04774-4 |