Assessing clinical reasoning in the OSCE: pilot-testing a novel oral debrief exercise
Abstract (excerpt): Clinical reasoning (CR) is a complex skill enabling the transition from clinical novice to expert decision maker. The Objective Structured Clinical Examination (OSCE) is widely used to evaluate clinical competency, though there is limited literature exploring how this assessment is...

Main Authors: Alexis Régent, Harish Thampy, Mini Singh
Format: Article
Language: English
Published: BMC, 2023-10-01
Series: BMC Medical Education
Subjects: Clinical reasoning assessment; Oral debrief; Objective structured clinical examination
Online Access: https://doi.org/10.1186/s12909-023-04668-5
_version_ | 1797558915020357632 |
author | Alexis Régent; Harish Thampy; Mini Singh
author_sort | Alexis Régent |
collection | DOAJ |
description | Introduction: Clinical reasoning (CR) is a complex skill enabling the transition from clinical novice to expert decision maker. The Objective Structured Clinical Examination (OSCE) is widely used to evaluate clinical competency, though there is limited literature exploring how this assessment is best used to assess CR skills. This proof-of-concept study explored the creation and pilot testing of a post-station CR assessment, named Oral Debrief (OD), in the context of undergraduate medical education. Methods: A modified Delphi technique was used to create a standardised, domain-based OD marking rubric encapsulating the key skills of CR, drawing upon existing literature and our existing placement-based CR tool. Sixteen OSCE examiners were recruited to score three simulated OD recordings that were scripted to portray differing levels of competency. Adopting a think-aloud approach, examiners vocalised their thought processes while using the rubric to assess each video. Thereafter, semi-structured interviews explored examiners' views on the OD approach. Recordings were transcribed, anonymised, and analysed deductively and inductively for recurring themes. Additionally, inter-rater agreement of examiners' scoring was determined using Fleiss' kappa statistic, both within the group and in comparison with a reference examiner group. Results: The rubric achieved fair to good inter-rater reliability across its constituent domains and overall global judgement scales. Think-aloud scoring revealed that participating examiners considered several factors when scoring students' CR abilities: the adoption of a confident, structured approach; discrimination between relevant and less relevant information; and the ability to prioritise and justify decision making. Furthermore, students' CR skills were judged in light of potential risks to patient safety and examiners' own illness scripts. Feedback from examiners indicated that, whilst additional training in rubric usage would be beneficial, OD offered a positive approach for examining CR ability. Conclusion: This pilot study has demonstrated promising results for the use of a novel post-station OD task to evaluate medical students' CR ability in the OSCE setting. Further work is now planned to evaluate how the OD approach can most effectively be implemented into routine assessment practice. |
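The inter-rater agreement analysis described above uses Fleiss' kappa, which compares observed agreement among a fixed number of raters against agreement expected by chance. As a minimal, purely illustrative sketch (the rater counts, items, and categories below are hypothetical, not the study's data), the statistic can be computed from an items-by-categories count table:

```python
# Illustrative sketch of Fleiss' kappa, the inter-rater agreement
# statistic named in the abstract. Input: one row per rated item,
# one column per rating category; each cell holds the number of
# raters who chose that category. Hypothetical values, not study data.

def fleiss_kappa(table):
    """Compute Fleiss' kappa for an items x categories count table."""
    n_items = len(table)
    n_raters = sum(table[0])  # assumes the same rater count per item
    # Mean per-item observed agreement
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in table
    ) / n_items
    # Chance agreement from the marginal category proportions
    total = n_items * n_raters
    p_e = sum(
        (sum(row[j] for row in table) / total) ** 2
        for j in range(len(table[0]))
    )
    return (p_bar - p_e) / (1 - p_e)

# Perfect agreement: 4 hypothetical raters, 3 items, 2 score bands
print(fleiss_kappa([[4, 0], [4, 0], [0, 4]]))  # 1.0
```

A kappa of 1.0 indicates perfect agreement, values near 0 indicate chance-level agreement, and negative values indicate systematic disagreement; the "fair to good" range reported in the abstract corresponds to conventional interpretation bands for this statistic.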
first_indexed | 2024-03-10T17:38:10Z |
format | Article |
id | doaj.art-0a9d61a11e254bbba85f5623f4dfbc7b |
institution | Directory Open Access Journal |
issn | 1472-6920 |
language | English |
last_indexed | 2024-03-10T17:38:10Z |
publishDate | 2023-10-01 |
publisher | BMC |
record_format | Article |
series | BMC Medical Education |
spelling | doaj.art-0a9d61a11e254bbba85f5623f4dfbc7b (indexed 2023-11-20T09:46:46Z); eng; BMC; BMC Medical Education; ISSN 1472-6920; 2023-10-01; doi:10.1186/s12909-023-04668-5
Title: Assessing clinical reasoning in the OSCE: pilot-testing a novel oral debrief exercise
Author affiliations:
- Alexis Régent: Service de médecine interne, Centre de référence maladies auto-immunes et systémiques rares d'Ile de France, APHP-CUP, Hôpital Cochin
- Harish Thampy: Division of Medical Education, Faculty of Medicine, Biology and Health, University of Manchester
- Mini Singh: Division of Medical Education, Faculty of Medicine, Biology and Health, University of Manchester
title | Assessing clinical reasoning in the OSCE: pilot-testing a novel oral debrief exercise |
title_sort | assessing clinical reasoning in the osce pilot testing a novel oral debrief exercise |
topic | Clinical reasoning assessment; Oral debrief; Objective structured clinical examination
url | https://doi.org/10.1186/s12909-023-04668-5 |
work_keys_str_mv | AT alexisregent assessingclinicalreasoningintheoscepilottestinganoveloraldebriefexercise AT harishthampy assessingclinicalreasoningintheoscepilottestinganoveloraldebriefexercise AT minisingh assessingclinicalreasoningintheoscepilottestinganoveloraldebriefexercise |