Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video
Recent advancements in computing and artificial intelligence (AI) make it possible to quantitatively evaluate human movement using digital video, thereby opening the possibility of more accessible gait analysis. The Edinburgh Visual Gait Score (EVGS) is an effective tool for observational gait analysis...
Main Authors: | Shri Harini Ramesh, Edward D. Lemaire, Albert Tu, Kevin Cheung, Natalie Baddour |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-05-01 |
Series: | Sensors |
Subjects: | gait analysis; Edinburgh Visual Gait Score; computer vision; motion analysis; pose estimation; smartphone video |
Online Access: | https://www.mdpi.com/1424-8220/23/10/4839 |
author | Shri Harini Ramesh; Edward D. Lemaire; Albert Tu; Kevin Cheung; Natalie Baddour
collection | DOAJ |
description | Recent advancements in computing and artificial intelligence (AI) make it possible to quantitatively evaluate human movement using digital video, thereby opening the possibility of more accessible gait analysis. The Edinburgh Visual Gait Score (EVGS) is an effective tool for observational gait analysis, but human scoring of videos can take over 20 min and requires experienced observers. This research developed an algorithmic implementation of the EVGS from handheld smartphone video to enable automatic scoring. Participant walking was video recorded at 60 Hz using a smartphone, and body keypoints were identified using the OpenPose BODY25 pose estimation model. An algorithm was developed to identify foot events and strides, and EVGS parameters were determined at relevant gait events. Stride detection was accurate within two to five frames. The level of agreement between the algorithmic and human reviewer EVGS results was strong for 14 of 17 parameters, and the algorithmic EVGS results were highly correlated (Pearson correlation coefficient r > 0.80) with the ground truth values for 8 of the 17 parameters. This approach could make gait analysis more accessible and cost-effective, particularly in areas without gait assessment expertise. These findings pave the way for future studies to explore the use of smartphone video and AI algorithms in remote gait analysis.
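For readers who want a concrete starting point, the sketch below illustrates in Python two steps named in the description: estimating heel-strike events from OpenPose BODY25 heel-keypoint trajectories, and computing the Pearson correlation used to compare algorithmic and human EVGS scores. Only the 60 Hz frame rate, the BODY25 model, and the use of Pearson's r come from the record above; the output folder name, confidence threshold, interpolation, and peak-based event rule are illustrative assumptions, not the authors' published algorithm.

```python
# Minimal sketch (not the authors' implementation): estimate heel-strike events
# from OpenPose BODY25 heel keypoints and compare algorithmic vs. human EVGS
# scores with Pearson's r.
import json
from pathlib import Path

import numpy as np
from scipy.signal import find_peaks
from scipy.stats import pearsonr

FPS = 60          # smartphone video recorded at 60 Hz (per the abstract)
R_HEEL = 24       # BODY25 keypoint index for the right heel
CONF_MIN = 0.3    # assumed confidence threshold for accepting a keypoint


def load_heel_y(json_dir: str, keypoint: int = R_HEEL) -> np.ndarray:
    """Read per-frame OpenPose JSON files and return the heel's vertical (y) trajectory.

    Low-confidence or missing frames are filled by linear interpolation.
    Assumes one person per frame (index 0), as in a typical gait capture.
    """
    ys = []
    for f in sorted(Path(json_dir).glob("*_keypoints.json")):
        people = json.loads(f.read_text())["people"]
        if not people:
            ys.append(np.nan)
            continue
        kp = np.asarray(people[0]["pose_keypoints_2d"]).reshape(-1, 3)
        x, y, c = kp[keypoint]
        ys.append(y if c >= CONF_MIN else np.nan)
    y = np.asarray(ys, dtype=float)
    idx = np.arange(len(y))
    good = ~np.isnan(y)
    return np.interp(idx, idx[good], y[good])


def heel_strike_frames(heel_y: np.ndarray, min_stride_s: float = 0.6) -> np.ndarray:
    """Estimate heel-strike frames as local maxima of image-space y.

    Image y grows downward, so the heel is lowest when y peaks; `min_stride_s`
    enforces a plausible minimum time between successive strikes.
    """
    peaks, _ = find_peaks(heel_y, distance=int(min_stride_s * FPS))
    return peaks


if __name__ == "__main__":
    heel_y = load_heel_y("openpose_output/")   # hypothetical OpenPose output folder
    strikes = heel_strike_frames(heel_y)
    print("Right heel strikes at frames:", strikes)

    # Agreement check: Pearson correlation between algorithmic and human scores
    # for one EVGS parameter across participants (illustrative numbers only).
    algo_scores = np.array([0, 1, 2, 1, 0, 2, 1, 1])
    human_scores = np.array([0, 1, 2, 1, 1, 2, 1, 0])
    r, p = pearsonr(algo_scores, human_scores)
    print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```

In practice, events detected this way would be refined against the paper's reported two-to-five-frame accuracy before scoring any EVGS parameter at those frames.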
format | Article |
id | doaj.art-a136a0927b044a37a2dc2556d79e1583 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
publishDate | 2023-05-01 |
publisher | MDPI AG |
series | Sensors |
spelling | Sensors, vol. 23, no. 10, art. 4839 (2023-05-01), MDPI AG; DOI 10.3390/s23104839. Affiliations: Shri Harini Ramesh and Natalie Baddour, Department of Mechanical Engineering, University of Ottawa, Ottawa, ON K1N 6N5, Canada; Edward D. Lemaire, The Ottawa Hospital Research Institute, Ottawa, ON K1H 8M2, Canada; Albert Tu, Department of Surgery, Division of Neurosurgery, Children’s Hospital of Eastern Ontario, Ottawa, ON K1H 8L1, Canada; Kevin Cheung, Department of Surgery, Division of Plastic Surgery, Children’s Hospital of Eastern Ontario, Ottawa, ON K1H 8L1, Canada.
title | Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video |
topic | gait analysis Edinburgh Visual Gait Score computer vision motion analysis pose estimation smartphone video |
url | https://www.mdpi.com/1424-8220/23/10/4839 |