RASPV: A Robotics Framework for Augmented Simulated Prosthetic Vision

One of the main challenges of visual prostheses is to augment the perceived information to improve the experience of their wearers. Because access to implanted patients is limited, new techniques are often evaluated via Simulated Prosthetic Vision (SPV) with sighted people. In this work, we introduce a novel SPV framework and implementation that presents major advantages with respect to previous approaches. First, it is integrated into a robotics framework, which allows us to benefit from a wide range of methods and algorithms from the field (e.g., object recognition, obstacle avoidance, autonomous navigation, deep learning). Second, we go beyond traditional image processing with 3D point cloud processing using an RGB-D camera, allowing us to robustly detect the floor, obstacles, and the structure of the scene. Third, it works either with a real camera or in a virtual environment, which gives us endless possibilities for immersive experimentation through a head-mounted display. Fourth, we incorporate a validated temporal phosphene model into the generation of the visual stimuli, replicating time effects. Finally, we have proposed, developed, and tested several applications within this framework, such as avoiding moving obstacles, providing a general understanding of the scene, staircase detection, helping the subject to navigate an unfamiliar space, and object and person detection. We provide experimental results in real and virtual environments. The code is publicly available at https://www.github.com/aperezyus/RASPV
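
As an illustration of the simulated prosthetic vision step described above, the following is a minimal, hypothetical Python sketch (not code from the RASPV repository): a camera frame is reduced to a small grid of Gaussian phosphenes with quantized brightness. Grid size, cell size, brightness levels, and file names are illustrative assumptions; RASPV additionally performs 3D point cloud processing and applies a validated temporal phosphene model, both omitted here.

import numpy as np
import cv2

def render_phosphenes(image_bgr, grid=32, cell=15, levels=8):
    """Render a crude simulated-prosthetic-vision view of an input frame."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # One brightness sample per phosphene (area average over the cell).
    samples = cv2.resize(gray, (grid, grid), interpolation=cv2.INTER_AREA) / 255.0
    samples = np.round(samples * (levels - 1)) / (levels - 1)  # quantize brightness

    # Precompute one Gaussian blob the size of a grid cell.
    ax = np.arange(cell) - (cell - 1) / 2.0
    blob = np.exp(-(ax[None, :] ** 2 + ax[:, None] ** 2) / (2 * (cell / 5.0) ** 2))

    out = np.zeros((grid * cell, grid * cell), dtype=np.float32)
    for r in range(grid):
        for c in range(grid):
            out[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell] = samples[r, c] * blob
    return (out * 255).astype(np.uint8)

if __name__ == "__main__":
    frame = cv2.imread("frame.png")  # hypothetical input: one color frame from the camera
    cv2.imwrite("phosphenes.png", render_phosphenes(frame))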


Bibliographic Details
Main Authors: Alejandro Perez-Yus, Maria Santos-Villafranca, Julia Tomas-Barba, Jesus Bermudez-Cameo, Lorenzo Montano-Olivan, Gonzalo Lopez-Nicolas, Jose J. Guerrero
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Access, vol. 12, pp. 15251-15267
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3357400
Subjects: Computer vision; navigation; RGB-D; simulated prosthetic vision; visually impaired assistance
Online Access: https://ieeexplore.ieee.org/document/10411899/

Author Affiliations:
Alejandro Perez-Yus (ORCID: 0000-0002-8949-2632): Instituto de Investigación en Ingeniería de Aragón (I3A), University of Zaragoza, Zaragoza, Spain
Maria Santos-Villafranca (ORCID: 0009-0009-5663-3185): Instituto de Investigación en Ingeniería de Aragón (I3A), University of Zaragoza, Zaragoza, Spain
Julia Tomas-Barba: Instituto de Investigación en Ingeniería de Aragón (I3A), University of Zaragoza, Zaragoza, Spain
Jesus Bermudez-Cameo (ORCID: 0000-0002-8479-1748): Instituto de Investigación en Ingeniería de Aragón (I3A), University of Zaragoza, Zaragoza, Spain
Lorenzo Montano-Olivan (ORCID: 0000-0003-0190-9331): ITAINNOVA (Instituto Tecnológico de Aragón), Zaragoza, Spain
Gonzalo Lopez-Nicolas (ORCID: 0000-0001-9347-5969): Instituto de Investigación en Ingeniería de Aragón (I3A), University of Zaragoza, Zaragoza, Spain
Jose J. Guerrero (ORCID: 0000-0001-5209-2267): Instituto de Investigación en Ingeniería de Aragón (I3A), University of Zaragoza, Zaragoza, Spain