Reliability, repeatability, and accordance between three different corneal diagnostic imaging devices for evaluating the ocular surface
Main Authors: | Abril L. Garcia-Terraza, David Jimenez-Collado, Francisco Sanchez-Sanoja, José Y. Arteaga-Rivera, Norma Morales Flores, Sofía Pérez-Solórzano, Yonathan Garfias, Enrique O. Graue-Hernández, Alejandro Navas |
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2022-07-01 |
Series: | Frontiers in Medicine |
Subjects: | dry eye disease; diagnostic imaging; ocular surface; topography; diagnosis |
Online Access: | https://www.frontiersin.org/articles/10.3389/fmed.2022.893688/full |
author | Abril L. Garcia-Terraza, David Jimenez-Collado, Francisco Sanchez-Sanoja, José Y. Arteaga-Rivera, Norma Morales Flores, Sofía Pérez-Solórzano, Yonathan Garfias, Enrique O. Graue-Hernández, Alejandro Navas |
collection | DOAJ |
description | Purpose: To evaluate repeatability, reproducibility, and accordance between ocular surface measurements obtained with three different imaging devices. Methods: We performed an observational study on 66 healthy eyes. Tear meniscus height, non-invasive tear break-up time (NITBUT), and meibography were measured using three corneal imaging devices: Keratograph 5M (Oculus, Wetzlar, Germany), Antares (Lumenis, Sydney, Australia), and LacryDiag (Quantel Medical, Cournon d’Auvergne, France). One-way ANOVAs with post hoc analyses were used to assess accordance of tear meniscus height and NITBUT measurements between devices. Reproducibility was assessed with coefficients of variation, and repeatability with intraclass correlation coefficients (ICCs). Reliability of meibography classification was analyzed by calculating Fleiss’ kappa and presented in Venn diagrams. Results: Coefficients of variation were high and differed greatly depending on the device and measurement. ICCs showed moderate reliability of NITBUT and tear meniscus height measurements. Tear meniscus height differed between the three devices, F(2, 195) = 15.24, p < 0.01: measurements obtained with Antares were higher (0.365 ± 0.0851 mm) than those with Keratograph 5M (0.293 ± 0.0790 mm) and LacryDiag (0.306 ± 0.0731 mm). NITBUT also differed between devices, F(2, 111) = 13.152, p < 0.01: measurements obtained with LacryDiag were lower (10.4 ± 1.82 s) than those with Keratograph 5M (12.6 ± 4.01 s) and Antares (12.6 ± 4.21 s). Fleiss’ kappa was -0.00487 for upper lid and 0.128 for lower lid meibography classification, indicating poor to slight agreement between devices. Conclusion: Depending on the device used and the parameter analyzed, measurements differed from one another, reflecting differences in image processing. |
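The statistical workflow described in the abstract (coefficients of variation for reproducibility, intraclass correlation coefficients for repeatability, one-way ANOVA for between-device accordance, and Fleiss' kappa for meibography grading agreement) can be sketched in Python. This is a minimal illustration, not the authors' analysis code: the data layout, column names (eye, device, rep, tmh), and simulated values are hypothetical, and it assumes pandas, SciPy, pingouin, and statsmodels are installed.

```python
import numpy as np
import pandas as pd
import pingouin as pg
from scipy.stats import f_oneway
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

rng = np.random.default_rng(0)

# Hypothetical long-format data: one row per eye x device x repetition.
df = pd.DataFrame({
    "eye":    np.repeat(np.arange(66), 9),
    "device": np.tile(np.repeat(["Keratograph5M", "Antares", "LacryDiag"], 3), 66),
    "rep":    np.tile([1, 2, 3], 66 * 3),
    "tmh":    rng.normal(0.32, 0.08, 66 * 9),  # tear meniscus height (mm), simulated
})

# Reproducibility: mean coefficient of variation of the repeated measurements per device.
cv = (df.groupby(["device", "eye"])["tmh"]
        .agg(lambda x: x.std(ddof=1) / x.mean())
        .groupby(level="device").mean())
print("Mean CV per device:\n", cv)

# Repeatability: intraclass correlation across the three repetitions within each device.
for device, sub in df.groupby("device"):
    icc = pg.intraclass_corr(data=sub, targets="eye", raters="rep", ratings="tmh")
    print(device, float(icc.loc[icc["Type"] == "ICC3", "ICC"].iloc[0]))

# Accordance between devices: one-way ANOVA on the per-eye mean for each device.
means = df.groupby(["device", "eye"])["tmh"].mean().unstack(level=0)
F, p = f_oneway(means["Keratograph5M"], means["Antares"], means["LacryDiag"])
print(f"ANOVA: F = {F:.2f}, p = {p:.4f}")

# Agreement of meibography grading: treat the three devices as three "raters" per eye
# and summarize categorical grades (e.g., meiboscore 0-3) with Fleiss' kappa.
grades = rng.integers(0, 4, size=(66, 3))   # simulated grades, one column per device
table, _ = aggregate_raters(grades)          # counts per category for each eye
print("Fleiss' kappa:", fleiss_kappa(table))
```

In practice the measured values would replace the simulated arrays, and the ICC model (one-way vs. two-way, single vs. average measures) should be chosen to match the study design.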
format | Article |
id | doaj.art-6da02ca531bb4ef783135d834775cae0 |
institution | Directory Open Access Journal |
issn | 2296-858X |
language | English |
publishDate | 2022-07-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Medicine |
doi | 10.3389/fmed.2022.893688 |
author affiliations | Abril L. Garcia-Terraza: Department of Cornea and Refractive Surgery, Conde de Valenciana Institute of Ophthalmology, Mexico City, Mexico; Faculty of Medicine, Autonomous University of Baja California, Mexicali, Baja California, Mexico. David Jimenez-Collado: Department of Cornea and Refractive Surgery, Conde de Valenciana Institute of Ophthalmology, Mexico City, Mexico; School of Medicine, Panamerican University, Mexico City, Mexico. Francisco Sanchez-Sanoja: Department of Cornea and Refractive Surgery, Conde de Valenciana Institute of Ophthalmology, Mexico City, Mexico; Faculty of Health Sciences North Campus, Anáhuac University, Mexico City, Mexico. José Y. Arteaga-Rivera, Norma Morales Flores, Sofía Pérez-Solórzano, Yonathan Garfias, Enrique O. Graue-Hernández, Alejandro Navas: Department of Cornea and Refractive Surgery, Conde de Valenciana Institute of Ophthalmology, Mexico City, Mexico. |
title | Reliability, repeatability, and accordance between three different corneal diagnostic imaging devices for evaluating the ocular surface |
topic | dry eye disease; diagnostic imaging; ocular surface; topography; diagnosis |
url | https://www.frontiersin.org/articles/10.3389/fmed.2022.893688/full |