LidSonic V2.0: A LiDAR and Deep-Learning-Based Green Assistive Edge Device to Enhance Mobility for the Visually Impaired

Over a billion people around the world are disabled, among whom 253 million are visually impaired or blind, and this number is increasing rapidly due to ageing, chronic diseases, and poor environments and health. Despite many proposals, current devices and systems lack maturity and do not completely fulfill user requirements and satisfaction. Increased research activity in this field is required to encourage the development, commercialization, and widespread acceptance of low-cost, affordable assistive technologies for visual impairment and other disabilities. This paper proposes a novel approach that uses a LiDAR with a servo motor and an ultrasonic sensor to collect data and predict objects using deep learning for environment perception and navigation. We adopted this approach in a pair of smart glasses, called LidSonic V2.0, to enable the identification of obstacles for the visually impaired. The LidSonic system consists of an Arduino Uno edge computing device integrated into the smart glasses and a smartphone app; the two communicate over Bluetooth. The Arduino gathers data, operates the sensors on the smart glasses, detects obstacles using simple data processing, and provides buzzer feedback to visually impaired users. The smartphone application collects data from the Arduino, detects and classifies objects in the spatial environment, and gives spoken feedback to the user on the detected objects. In comparison with image-processing-based glasses, LidSonic uses far less processing time and energy to classify obstacles from simple LiDAR data comprising several integer measurements. We comprehensively describe the proposed system's hardware and software design, having constructed prototype implementations and tested them in real-world environments. Using the open platforms WEKA and TensorFlow, the entire LidSonic system is built with affordable off-the-shelf sensors and a microcontroller board costing less than USD 80. Essentially, we provide designs for an inexpensive, miniature green device that can be built into, or mounted on, any pair of glasses or even a wheelchair to help the visually impaired. Our approach enables faster inference and decision-making using relatively low energy and smaller data sizes, as well as faster communications for edge, fog, and cloud computing.

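To make the data flow described in the abstract more concrete, the following Arduino-style C++ sketch illustrates the kind of edge loop involved: a servo sweeps the LiDAR across the walking path, an ultrasonic sensor guards the immediate path, a buzzer gives a local warning, and each sweep is streamed over Bluetooth to the smartphone app, which runs the deep-learning classifier. This is a minimal sketch under stated assumptions, not the authors' firmware: the pin assignments, the HC-SR04 ultrasonic and HC-05 Bluetooth modules, the 100 cm threshold, the sweep range, and the readLidarCm() helper are all hypothetical.

// Hypothetical edge loop for an Arduino Uno based on the architecture the
// abstract describes; hardware choices and thresholds are assumptions.
#include <Servo.h>
#include <SoftwareSerial.h>

const int SERVO_PIN  = 9;
const int TRIG_PIN   = 6;     // HC-SR04 trigger (assumed module)
const int ECHO_PIN   = 7;     // HC-SR04 echo (assumed module)
const int BUZZER_PIN = 8;
const int NEAR_CM    = 100;   // simple obstacle-warning threshold (assumed)

Servo sweepServo;
SoftwareSerial bt(2, 3);      // RX, TX to an HC-05 Bluetooth module (assumed)

// Placeholder for the sensor-specific LiDAR read (UART or I2C, depending on
// the module used); returns the measured distance in centimetres.
int readLidarCm() { return 0; /* replace with the real driver call */ }

int readUltrasonicCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long us = pulseIn(ECHO_PIN, HIGH, 30000UL);  // time out after roughly 5 m
  if (us == 0) return 999;                     // timed out: treat as clear
  return (int)(us / 58);                       // echo time to centimetres
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(BUZZER_PIN, OUTPUT);
  sweepServo.attach(SERVO_PIN);
  bt.begin(9600);
}

void loop() {
  // Sweep the LiDAR across the path and stream one reading per angle.
  for (int angle = 45; angle <= 135; angle += 5) {
    sweepServo.write(angle);
    delay(30);                           // let the servo settle
    int d = readLidarCm();
    bt.print(d);                         // the phone app collects the sweep
    bt.print(angle < 135 ? ',' : '\n');  // and runs the DL classifier on it
  }

  // Local, low-latency safety check: buzz if something is directly ahead.
  if (readUltrasonicCm() < NEAR_CM) {
    tone(BUZZER_PIN, 2000, 200);         // short warning beep
  }
}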

Bibliographic Details
Main Authors: Sahar Busaeed, Iyad Katib, Aiiad Albeshri, Juan M. Corchado, Tan Yigitcanlar, Rashid Mehmood
Format: Article
Language: English
Published: MDPI AG, 2022-09-01
Series: Sensors
Subjects: visually impaired; smart mobility; sensors; LiDAR; ultrasonic; deep learning
Online Access: https://www.mdpi.com/1424-8220/22/19/7435
Author Affiliations: Sahar Busaeed (Faculty of Computer and Information Sciences, Imam Mohammad Ibn Saud Islamic University, Riyadh 11564, Saudi Arabia); Iyad Katib and Aiiad Albeshri (Department of Computer Science, Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah 21589, Saudi Arabia); Juan M. Corchado (Bisite Research Group, University of Salamanca, 37007 Salamanca, Spain); Tan Yigitcanlar (School of Architecture and Built Environment, Queensland University of Technology, 2 George Street, Brisbane, QLD 4000, Australia); Rashid Mehmood (High Performance Computing Center, King Abdulaziz University, Jeddah 21589, Saudi Arabia)
ISSN: 1424-8220
Article Details: Sensors 2022, vol. 22, no. 19, article 7435
DOI: 10.3390/s22197435