Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation

Bibliographic Details
Main Authors: Cristian Vilar Giménez, Silvia Krug, Faisal Z. Qureshi, Mattias O’Nils
Format: Article
Language: English
Published / Created: MDPI AG, 2021-11-01
Series: Journal of Imaging
Subjects: 3D object recognition; YOLO; YOLO-Tiny; 3DHOG; histogram of oriented gradients; ModelNet40
Online Access: https://www.mdpi.com/2313-433X/7/12/255
_version_ 1827671844947230720
author Cristian Vilar Giménez
Silvia Krug
Faisal Z. Qureshi
Mattias O’Nils
author_facet Cristian Vilar Giménez
Silvia Krug
Faisal Z. Qureshi
Mattias O’Nils
author_sort Cristian Vilar Giménez
collection DOAJ
description Powered wheelchairs have enhanced the mobility and quality of life of people with special needs. The next step in the development of powered wheelchairs is to incorporate sensors and electronic systems for new control applications and capabilities that improve their usability and the safety of their operation, such as obstacle avoidance or autonomous driving. However, autonomous powered wheelchairs require safe navigation in different environments and scenarios, which makes their development complex. In our research, we propose instead to develop contactless control for powered wheelchairs in which the position of the caregiver is used as a control reference. Hence, we used a depth camera to recognize the caregiver and, at the same time, measure their distance from the powered wheelchair. In this paper, we compared two approaches for real-time object recognition: 3DHOG, a hand-crafted object descriptor based on a 3D extension of the histogram of oriented gradients (HOG), and a convolutional neural network based on YOLOv4-Tiny. To evaluate both approaches, we constructed Miun-Feet, a custom dataset of labeled images of a caregiver’s feet captured in different scenarios, backgrounds, objects, and lighting conditions. The experimental results showed that the YOLOv4-Tiny approach outperformed 3DHOG in all the analyzed cases. In addition, the results showed that recognition accuracy was not improved by the depth channel, which enables the use of a monocular RGB camera instead of a depth camera and reduces computational cost and heat-dissipation constraints. Hence, the paper proposes an additional method to compute the caregiver’s distance and angle from the powered wheelchair (PW) using only RGB data. This work shows that it is feasible to use the location of the caregiver’s feet as a control signal for a powered wheelchair and that a monocular RGB camera is sufficient to compute their relative positions.
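The description above is the only technical prose in this record. Purely as an illustration of the kind of RGB-only geometry it alludes to (computing the caregiver's distance and angle from detected feet), the sketch below intersects the viewing ray through the bottom of a feet bounding box with a flat floor under a pinhole-camera model. All camera parameters, the 640x480 frame size, and the function name are assumptions made for this example; they are not taken from the paper.

```python
import math

# Assumed camera parameters (placeholders, not values from the paper):
# a forward-facing RGB camera mounted on the wheelchair.
FOCAL_PX = 600.0                     # assumed focal length in pixels
CX, CY = 320.0, 240.0                # assumed principal point for a 640x480 frame
CAM_HEIGHT_M = 0.8                   # assumed camera height above the floor (m)
CAM_PITCH_RAD = math.radians(15.0)   # assumed downward tilt of the camera


def distance_and_angle(bbox):
    """Estimate forward distance and bearing of detected feet.

    bbox = (x_min, y_min, x_max, y_max) in pixels, e.g. from a YOLO-style
    detector; the bottom edge of the box is assumed to rest on the floor.
    Returns (distance_m, bearing_rad) or None if the ray misses the floor.
    """
    u = 0.5 * (bbox[0] + bbox[2])    # horizontal centre of the detection
    v = bbox[3]                      # bottom edge: the foot/floor contact row

    # Angle of the viewing ray below the horizon for image row v.
    ray_pitch = math.atan2(v - CY, FOCAL_PX) + CAM_PITCH_RAD
    if ray_pitch <= 0.0:
        return None                  # ray points at or above the horizon

    distance = CAM_HEIGHT_M / math.tan(ray_pitch)    # flat-floor intersection
    bearing = math.atan2(u - CX, FOCAL_PX)            # left/right angle from centre
    return distance, bearing


if __name__ == "__main__":
    # A detection box near the lower part of a 640x480 frame.
    print(distance_and_angle((400, 300, 480, 430)))  # roughly (1.25 m, 0.20 rad)
```

A real system would use calibrated intrinsics and extrinsics and smooth the estimates over consecutive detections; this sketch only shows the flat-floor approximation under the stated assumptions.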
first_indexed 2024-03-10T03:49:30Z
format Article
id doaj.art-22d59d3ac56b4f968950c7b901b56962
institution Directory Open Access Journal
issn 2313-433X
language English
last_indexed 2024-03-10T03:49:30Z
publishDate 2021-11-01
publisher MDPI AG
record_format Article
series Journal of Imaging
spelling doaj.art-22d59d3ac56b4f968950c7b901b56962 (updated 2023-11-23T09:00:35Z): eng; MDPI AG; Journal of Imaging; ISSN 2313-433X; published 2021-11-01; vol. 7, iss. 12, art. 255; DOI 10.3390/jimaging7120255. Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation. Cristian Vilar Giménez, Silvia Krug, Faisal Z. Qureshi, Mattias O’Nils (Department of Electronics Design, Mid Sweden University, Holmgatan 10, 851 70 Sundsvall, Sweden). Abstract as in the description field above. https://www.mdpi.com/2313-433X/7/12/255. Topics: 3D object recognition; YOLO; YOLO-Tiny; 3DHOG; histogram of oriented gradients; ModelNet40.
spellingShingle Cristian Vilar Giménez
Silvia Krug
Faisal Z. Qureshi
Mattias O’Nils
Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation
Journal of Imaging
3D object recognition
YOLO
YOLO-Tiny
3DHOG
histogram of oriented gradients
ModelNet40
title Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation
title_full Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation
title_fullStr Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation
title_full_unstemmed Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation
title_short Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation
title_sort evaluation of 2d 3d feet detection methods for semi autonomous powered wheelchair navigation
topic 3D object recognition
YOLO
YOLO-Tiny
3DHOG
histogram of oriented gradients
ModelNet40
url https://www.mdpi.com/2313-433X/7/12/255
work_keys_str_mv AT cristianvilargimenez evaluationof2d3dfeetdetectionmethodsforsemiautonomouspoweredwheelchairnavigation
AT silviakrug evaluationof2d3dfeetdetectionmethodsforsemiautonomouspoweredwheelchairnavigation
AT faisalzqureshi evaluationof2d3dfeetdetectionmethodsforsemiautonomouspoweredwheelchairnavigation
AT mattiasonils evaluationof2d3dfeetdetectionmethodsforsemiautonomouspoweredwheelchairnavigation