Methodological Problems With Online Concussion Testing
Main Authors: | Jameson Holden, Eric Francisco, Anna Tommerdahl, Rachel Lensch, Bryan Kirsch, Laila Zai, Alan J. Pearce, Oleg V. Favorov, Robert G. Dennis, Mark Tommerdahl |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2020-10-01 |
Series: | Frontiers in Human Neuroscience |
Subjects: | reaction time; reaction time variability; online cognitive testing; online concussion testing; intraindividual reaction time variability; concussion |
Online Access: | https://www.frontiersin.org/article/10.3389/fnhum.2020.509091/full |
Description: | Reaction time testing is widely used in online computerized concussion assessments, and most concussion studies utilizing the metric have demonstrated varying degrees of difference between concussed and non-concussed individuals. The problem with most of these online concussion assessments is that they predominantly rely on consumer-grade technology. Typical administration of these reaction time tests involves presenting a visual stimulus on a computer monitor and prompting the test subject to respond as quickly as possible via keypad or computer mouse. However, inherent delays and variabilities are introduced into the reaction time measure by both the computer and the associated operating system on which the concussion assessment tool is installed. The authors hypothesized that the systems typically used to collect concussion reaction time data would demonstrate significant errors in reaction time measurements. To remove human bias, a series of experiments was conducted robotically to assess the timing errors introduced by reaction time tests under four different conditions. In the first condition, a visual reaction time test was conducted by flashing a visual stimulus on a computer monitor; detection was via photodiode, and the mechanical response was delivered via computer mouse. The second condition employed a mobile device for the visual stimulus, and the mechanical response was delivered to the mobile device's touchscreen. The third condition simulated a tactile reaction time test, with the mechanical response delivered via computer mouse. The fourth condition also simulated a tactile reaction time test, but the response was delivered to a dedicated device designed to store the interval between stimulus delivery and response, thus bypassing any problems hypothesized to be introduced by the computer and/or computer software. There were significant differences in the range of responses recorded under the four conditions, with the reaction times collected from the visual stimulus on a mobile device being the worst and the device with dedicated hardware designed for the task being the best. The results suggest that some of the commonly used visual tasks on consumer-grade computers could be (and have been) introducing significant errors into reaction time testing, and that dedicated hardware designed for the reaction time task is needed to minimize testing errors. |
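The timing pitfall the abstract describes can be illustrated with a minimal simulation. This sketch is not from the study and uses hypothetical round numbers (a 60 Hz monitor, a 125 Hz input poll, a perfectly consistent 250 ms responder): even with perfect software timestamps, the stimulus only becomes visible at the next display refresh and the response is only registered at the next input poll, so the recorded interval both overstates and varies around the true reaction time.

```python
import random

def measured_rt(true_rt_ms, refresh_hz=60, poll_hz=125):
    """Simulate the reaction time a consumer-grade system would record.

    The stimulus becomes visible only at the next monitor refresh, and the
    mouse/keyboard response is registered only at the next input-device poll,
    so the recorded interval differs from the true reaction time.
    """
    frame_ms = 1000.0 / refresh_hz               # ~16.7 ms per frame at 60 Hz
    poll_ms = 1000.0 / poll_hz                   # ~8 ms per poll at 125 Hz
    display_delay = random.uniform(0, frame_ms)  # wait for the next refresh
    input_delay = random.uniform(0, poll_ms)     # wait for the next poll
    return true_rt_ms + display_delay + input_delay

# A responder with a perfectly constant 250 ms reaction time still shows
# trial-to-trial spread purely from display and input-polling delays:
samples = [measured_rt(250.0) for _ in range(10_000)]
spread = max(samples) - min(samples)
mean = sum(samples) / len(samples)
print(f"mean recorded RT: {mean:.1f} ms, spread: {spread:.1f} ms")
```

Under these assumed numbers alone, roughly 25 ms of artifactual spread appears with zero true variability, which is the kind of error a dedicated stimulus-and-response device avoids by timestamping in hardware.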
Affiliations: | Cortical Metrics LLC, Carrboro, NC, United States (J. Holden, E. Francisco, A. Tommerdahl, R. Lensch, B. Kirsch, R. G. Dennis, M. Tommerdahl); Lucent Research, Denver, CO, United States (L. Zai); College of Health Science and Engineering, La Trobe University, Melbourne, VIC, Australia (A. J. Pearce); Department of Biomedical Engineering, The University of North Carolina at Chapel Hill, Chapel Hill, NC, United States (O. V. Favorov, R. G. Dennis, M. Tommerdahl) |
ISSN: | 1662-5161 |