Collecting evidence of validity for an assessment tool for Norwegian medical students’ non-technical skills (NorMS-NTS): usability and reliability when used by novice raters

Abstract

Background: NorMS-NTS is a tool for assessing Norwegian medical students' non-technical skills (NTS). It was designed to provide student feedback, training evaluations, and skill-level comparisons among students at different study sites, and to suit the needs of busy doctors acting as near-peer educators without requiring extensive rater training. The aim of this study was to examine the usability of the NorMS-NTS tool and to make a preliminary assessment of its validity when used by novice raters.

Methods: This study focused on the usability of the assessment tool and its internal structure. Three raters used the NorMS-NTS tool to individually rate the team leader, a medical student, in 20 video-recorded multi-professional simulation-based team trainings. Based on these ratings, we examined the tool's internal structure by calculating interrater reliability using the single-measure intraclass correlation coefficient, ICC(3,1), together with internal consistency and observability. After the rating process was completed, the raters answered a questionnaire about the tool's usability.

Results: The ICC agreement for the sum of the overall global scores across all raters was fair: ICC(3,1) = 0.53. The correlation coefficients for the pooled raters ranged from 0.77 to 0.91. Cronbach's alpha for elements, categories, and the global score was mostly above 0.90, and observability was high (95%–100%). All raters found the tool easy to use, considered none of the elements redundant, and found the written instructions helpful. They also found the tool easier to use once they had acclimated to it, and all stated that they could use it for both training and teaching.

Conclusions: The observed ICC agreement was 0.08 below the suggested ICC level for formative assessment (above 0.60). However, that suggested level is based on the average-measures ICC, which is always higher than the single-measure ICC. There are currently no suggested levels for the single-measure ICC, but other validated NTS tools report single-measure ICCs in the same range. We consider NorMS-NTS a usable tool for formative assessment of Norwegian medical students' non-technical skills during multi-professional team training by raters who are new to the tool. Further evidence on validity and on the consequences of using the tool is needed to fully validate it for formative assessment.
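As a rough illustration of the reliability statistics named in the abstract, the sketch below computes a single-measure ICC(3,1) (Shrout and Fleiss two-way mixed effects, consistency, single rater) and Cronbach's alpha from a made-up ratings matrix. The 20 x 3 layout mirrors the study design (20 videos, three raters), but the numbers, function names, and scoring scale are illustrative assumptions, not the study's data or analysis code.

```python
import numpy as np

def icc_3_1(ratings: np.ndarray) -> float:
    """Single-measure ICC(3,1): two-way mixed effects, consistency, single rater.
    `ratings` is an (n_subjects, k_raters) array."""
    n, k = ratings.shape
    grand_mean = ratings.mean()
    subject_means = ratings.mean(axis=1)
    rater_means = ratings.mean(axis=0)

    # Two-way ANOVA sums of squares (no interaction term).
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_subjects = k * ((subject_means - grand_mean) ** 2).sum()
    ss_raters = n * ((rater_means - grand_mean) ** 2).sum()
    ss_error = ss_total - ss_subjects - ss_raters

    ms_subjects = ss_subjects / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_subjects - ms_error) / (ms_subjects + (k - 1) * ms_error)

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha, treating each column (rater or scale element) as an item."""
    k = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)      # variance of each column
    total_var = ratings.sum(axis=1).var(ddof=1)  # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical global scores: 20 videos, three raters, 1-5 scale.
    true_skill = rng.integers(2, 5, size=20)
    noise = rng.normal(0.0, 0.5, size=(20, 3))
    scores = np.clip(np.round(true_skill[:, None] + noise), 1, 5)

    print(f"ICC(3,1)         = {icc_3_1(scores):.2f}")
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```

Averaging the three raters' scores and reporting an average-measures ICC(3,k) instead would, as the conclusions note, always yield a higher value than the single-measure ICC(3,1) computed here.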

Bibliographic Details
Main Authors: Katrine Prydz (Interprofessional Rural Research Team, Faculty of Health Sciences, Department of Clinical Medicine, University of Tromsø – the Arctic University of Norway), Peter Dieckmann (Copenhagen Academy for Medical Education and Simulation (CAMES), Center for Human Resources and Education, Capital Region of Denmark), Hans Fagertun (Capturo AS), David Musson (Faculty of Health Sciences, Department of Anesthesia, McMaster University), Torben Wisborg (Interprofessional Rural Research Team, Faculty of Health Sciences, Department of Clinical Medicine, University of Tromsø – the Arctic University of Norway)
Format: Article
Language: English
Published: BMC, 2023-11-01
Series: BMC Medical Education
ISSN: 1472-6920
Subjects: NorMS-NTS; Non-technical skills; Medical students; Assessment; Simulation-based training; Validation
Online Access: https://doi.org/10.1186/s12909-023-04837-6