Examining agreement between clinicians when assessing sick children
Background: Case management guidelines use a limited set of clinical features to guide assessment and treatment for common childhood diseases in poor countries. Using video records of clinical signs, we assessed agreement among experts and whether Kenyan health workers could identify signs defined by expert consensus. Methodology: 104 videos representing 11 clinical sign categories were presented to experts using a web questionnaire. Proportionate agreement and agreement beyond chance were calculated using kappa and the AC1 statistic. 31 videos were then selected and presented to local health workers: 20 for which experts had demonstrated clear agreement and 11 for which experts could not demonstrate agreement. Principal findings: Experts reached very high levels of chance-adjusted agreement for some videos, while for a few videos no agreement beyond chance was found. Where experts agreed, Kenyan hospital staff of all cadres recognised signs with high mean sensitivity and specificity (sensitivity: 0.897-0.975; specificity: 0.813-0.894); years of experience, gender and hospital had no influence on mean sensitivity or specificity. Local health workers did not agree on videos where experts had low or no agreement. Results of different agreement statistics for multiple observers, the AC1 and Fleiss' kappa, differ across the range of proportionate agreement. Conclusion: Videos provide a useful means to test agreement amongst geographically diverse groups of health workers. Kenyan health workers agree with experts where clinical signs are clear-cut, supporting the potential value of assessment and management guidelines. However, clinical signs are not always clear-cut, and video recordings offer one means to help standardise their interpretation.
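The abstract notes that Fleiss' kappa and Gwet's AC1 can disagree across the range of proportionate agreement, because the two statistics correct for chance differently. A minimal sketch of both calculations, using a hypothetical ratings matrix (not the study's data) where `counts[i][j]` is the number of raters assigning category `j` to video `i`:

```python
from typing import List, Tuple

def agreement_stats(counts: List[List[int]]) -> Tuple[float, float, float]:
    """Return (observed agreement, Fleiss' kappa, Gwet's AC1) for a
    ratings table counts[item][category] with a constant number of
    raters per item."""
    N = len(counts)          # number of items (videos)
    k = len(counts[0])       # number of categories
    n = sum(counts[0])       # raters per item (assumed constant)

    # Mean per-item observed agreement (pairs of raters who agree).
    p_obs = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N

    # Marginal proportion of ratings falling in each category.
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]

    # Fleiss' kappa: chance agreement is the sum of squared marginals.
    pe_kappa = sum(q * q for q in p)
    kappa = (p_obs - pe_kappa) / (1 - pe_kappa)

    # Gwet's AC1: chance agreement uses p_j * (1 - p_j), which stays
    # small when category prevalence is skewed.
    pe_ac1 = sum(q * (1 - q) for q in p) / (k - 1)
    ac1 = (p_obs - pe_ac1) / (1 - pe_ac1)

    return p_obs, kappa, ac1

# Hypothetical: 4 videos, 3 raters, binary sign present/absent.
# Skewed prevalence makes kappa and AC1 diverge:
# kappa ~= 0.111, AC1 ~= 0.467 at the same observed agreement of 2/3.
print(agreement_stats([[2, 1], [3, 0], [1, 2], [3, 0]]))
```

This illustrates the paper's observation: at identical proportionate agreement, a skewed sign prevalence depresses kappa far more than AC1.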
Main Authors: | Wagai, J; Senga, J; Fegan, G; English, M |
---|---|
Other Authors: | Scherer, R |
Format: | Journal article |
Language: | English |
Published: | Public Library of Science, 2009 |
Subjects: | Organisation and evaluation of medical care; Paediatrics; Infectious diseases |
author | Wagai, J Senga, J Fegan, G English, M |
author2 | Scherer, R |
collection | OXFORD |
description | Background: Case management guidelines use a limited set of clinical features to guide assessment and treatment for common childhood diseases in poor countries. Using video records of clinical signs, we assessed agreement among experts and whether Kenyan health workers could identify signs defined by expert consensus. Methodology: 104 videos representing 11 clinical sign categories were presented to experts using a web questionnaire. Proportionate agreement and agreement beyond chance were calculated using kappa and the AC1 statistic. 31 videos were then selected and presented to local health workers: 20 for which experts had demonstrated clear agreement and 11 for which experts could not demonstrate agreement. Principal findings: Experts reached very high levels of chance-adjusted agreement for some videos, while for a few videos no agreement beyond chance was found. Where experts agreed, Kenyan hospital staff of all cadres recognised signs with high mean sensitivity and specificity (sensitivity: 0.897-0.975; specificity: 0.813-0.894); years of experience, gender and hospital had no influence on mean sensitivity or specificity. Local health workers did not agree on videos where experts had low or no agreement. Results of different agreement statistics for multiple observers, the AC1 and Fleiss' kappa, differ across the range of proportionate agreement. Conclusion: Videos provide a useful means to test agreement amongst geographically diverse groups of health workers. Kenyan health workers agree with experts where clinical signs are clear-cut, supporting the potential value of assessment and management guidelines. However, clinical signs are not always clear-cut, and video recordings offer one means to help standardise their interpretation. |
format | Journal article |
id | oxford-uuid:2766e218-de86-4d98-a238-0501fe5e091f |
institution | University of Oxford |
language | English |
publishDate | 2009 |
publisher | Public Library of Science |
record_format | dspace |
title | Examining agreement between clinicians when assessing sick children |
topic | Organisation and evaluation of medical care Paediatrics Infectious diseases |