Wearable Travel Aid for Environment Perception and Navigation of Visually Impaired People

Full description

Assistive devices for visually impaired people (VIP) that support daily travel and improve social inclusion are developing fast. Most of them address navigation or obstacle avoidance, while other work focuses on helping VIP recognize surrounding objects; however, very few couple both capabilities (i.e., navigation and recognition). To meet these needs, this paper presents a wearable assistive device that allows VIP to (i) navigate safely and quickly in unfamiliar environments and (ii) recognize objects in both indoor and outdoor environments. The device consists of a consumer Red, Green, Blue and Depth (RGB-D) camera and an Inertial Measurement Unit (IMU), both mounted on a pair of eyeglasses, together with a smartphone. The device leverages ground-height continuity across adjacent image frames to segment the ground accurately and rapidly, and then searches for the moving direction based on the segmented ground. A lightweight Convolutional Neural Network (CNN)-based object recognition system is developed and deployed on the smartphone to increase the perception ability of VIP and support the navigation system; it provides semantic information about the surroundings, such as the categories, locations, and orientations of objects. Human-machine interaction is performed through an audio module (a beeping sound for obstacle alerts, speech recognition for understanding user commands, and speech synthesis for expressing the semantic information of the surroundings). We evaluated the proposed system through extensive experiments in both indoor and outdoor scenarios, demonstrating its efficiency and safety.
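
The ground-segmentation idea described above (exploiting ground-height continuity between adjacent frames) can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the paper's implementation: it assumes the RGB-D depth image has already been back-projected to 3-D points in a gravity-aligned frame using the IMU orientation, and the function names and the 5 cm tolerance are invented for the example.

```python
# Minimal sketch: ground segmentation via frame-to-frame ground-height continuity.
# Assumes points_xyz is an (H, W, 3) array of 3-D points in a gravity-aligned
# frame (y-axis along gravity), e.g. obtained from the RGB-D depth map and the
# IMU orientation. The tolerance and update rule are illustrative assumptions.
import numpy as np

def segment_ground(points_xyz, ground_height_prev, tol=0.05):
    """Label pixels whose height stays close to the previous frame's ground height."""
    height = points_xyz[..., 1]                              # per-pixel height (metres)
    ground_mask = np.abs(height - ground_height_prev) < tol  # continuity test
    if ground_mask.any():
        # Re-estimate the ground height from the pixels just labelled as ground,
        # so the estimate tracks slow changes (ramps, drift) across frames.
        ground_height_new = float(np.median(height[ground_mask]))
    else:
        ground_height_new = ground_height_prev               # keep the old estimate
    return ground_mask, ground_height_new

# Illustrative usage over a depth-video sequence (helper names are hypothetical):
# h = initial_ground_height_from_calibration()
# for frame in frames:
#     mask, h = segment_ground(frame_points(frame), h)
#     direction = pick_moving_direction(mask)   # downstream step on the segmented ground
```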

Bibliographic Details
Main Authors: Jinqiang Bai, Zhaoxiang Liu, Yimin Lin, Ye Li, Shiguo Lian, Dijun Liu
Format: Article
Language: English
Published: MDPI AG, 2019-06-01
Series: Electronics
Subjects: wearable assistive device; blind navigation; object recognition; visually impaired people; ground segmentation
Online Access: https://www.mdpi.com/2079-9292/8/6/697
ISSN: 2079-9292
Collection: DOAJ (Directory of Open Access Journals), record id doaj.art-770d05b861594714b02235d351aa206f
DOI: 10.3390/electronics8060697
Citation: Electronics, 2019, vol. 8, no. 6, article 697
Author Affiliations:
Jinqiang Bai: School of Electronic Information Engineering, Beihang University, No. 37 Xueyuan Rd., Haidian District, Beijing 10083, China
Zhaoxiang Liu, Yimin Lin, Ye Li, Shiguo Lian: Department of AI, CloudMinds Technologies Inc., Beijing 100102, China
Dijun Liu: China Academy of Telecommunication Technology, Beijing 10083, China