A Multi-Sensory Guidance System for the Visually Impaired Using YOLO and ORB-SLAM

Guidance systems for visually impaired persons have become a popular research topic in recent years. Existing guidance systems on the market typically rely on auxiliary tools and methods such as GPS, UWB, or a simple white cane, each of which engages only a single sense, tactile or auditory. These approaches can be inadequate in a complex indoor environment. This paper proposes a multi-sensory guidance system for the visually impaired that provides both tactile and auditory cues using ORB-SLAM and YOLO. Based on an RGB-D camera, local obstacle avoidance is realized at the tactile level through point cloud filtering, which alerts the user via a vibrating motor. The proposed method also generates a dense navigation map, via coordinate transformation, for global obstacle avoidance and path planning. Real-time target detection and a voice-prompt system based on YOLO are incorporated at the auditory level. The system is implemented as a smart cane and evaluated in four test scenarios. Experimental results demonstrate that obstacles in the walking path can be reliably located and classified in real time. By integrating YOLO with ORB-SLAM, the proposed system can serve as a capable aid that helps visually impaired people navigate safely.
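As an illustration of the tactile channel described in the abstract, the following is a minimal sketch, not the authors' code, of how an RGB-D depth frame might be filtered into a point cloud and mapped to a vibration cue. The camera intrinsics (fx, fy, cx, cy), the corridor thresholds, and the set_vibration() motor call are hypothetical placeholders.

```python
# Hedged sketch: local obstacle detection from a depth frame via simple
# point-cloud filtering, mapped to a vibration intensity. All parameter
# values and the motor driver call are illustrative assumptions.
import numpy as np

def depth_to_points(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into camera-frame 3-D points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop invalid (zero-depth) pixels

def nearest_obstacle_distance(points, halfwidth=0.4, y_min=-0.5, y_max=0.8):
    """Keep points inside an illustrative box in front of the camera
    (x = lateral, y = vertical with +y pointing down in camera coordinates,
    z = forward) and return the smallest forward distance, if any."""
    box = (np.abs(points[:, 0]) < halfwidth) & \
          (points[:, 1] > y_min) & (points[:, 1] < y_max)
    ahead = points[box]
    return float(ahead[:, 2].min()) if len(ahead) else None

def vibration_level(distance, warn_at=1.5):
    """Map obstacle distance to a 0..1 vibration intensity (closer = stronger)."""
    if distance is None or distance >= warn_at:
        return 0.0
    return 1.0 - distance / warn_at

# Example usage (depth in millimeters converted to meters):
#   depth = np.asarray(rgbd_frame, dtype=np.float32) / 1000.0
#   level = vibration_level(nearest_obstacle_distance(
#       depth_to_points(depth, fx, fy, cx, cy)))
#   set_vibration(level)   # hypothetical motor driver call
```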

Bibliographic Details
Main Authors: Zaipeng Xie, Zhaobin Li, Yida Zhang, Jianan Zhang, Fangming Liu, Wei Chen
Format: Article
Language: English
Published: MDPI AG, 2022-07-01
Series: Information
Subjects: multi-sensory guidance system; visually impaired; ORB-SLAM; YOLO; point map building; indoor navigation
Online Access: https://www.mdpi.com/2078-2489/13/7/343
Author Affiliations: Key Laboratory of Water Big Data Technology of Ministry of Water Resources, Nanjing 211100, China (Zaipeng Xie); Department of Computer Science and Technology, Hohai University, Nanjing 211100, China (Zhaobin Li, Yida Zhang, Jianan Zhang, Fangming Liu, Wei Chen)
Citation: Information, vol. 13, no. 7, article 343, 2022
ISSN: 2078-2489
DOI: 10.3390/info13070343
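The abstract also mentions real-time target detection with voice prompts at the auditory level. Below is a hedged sketch of that idea only; the libraries shown (ultralytics YOLO, OpenCV, pyttsx3) and the pretrained weights file are assumptions for illustration, as the record does not name the authors' implementation.

```python
# Hedged sketch: run a YOLO detector on camera frames and speak the names
# of confidently detected objects. Library choices and weights are assumed.
import cv2
import pyttsx3
from ultralytics import YOLO

model = YOLO("yolov8n.pt")        # assumed pretrained weights
tts = pyttsx3.init()              # offline text-to-speech engine
cap = cv2.VideoCapture(0)         # RGB stream; the paper uses an RGB-D camera

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]
    # Announce each sufficiently confident detection by class name.
    labels = {model.names[int(b.cls)] for b in result.boxes if float(b.conf) > 0.5}
    if labels:
        tts.say("Ahead: " + ", ".join(sorted(labels)))
        tts.runAndWait()
```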