Human-Machine Cooperative Echolocation Using Ultrasound
Echolocation has been shown to improve the independence of visually impaired people, and utilizing ultrasound in echolocation offers additional advantages, such as a higher resolution of object sensing and ease of extraction from background sounds. However, humans cannot innately make and hear ultrasound. A wearable device that enables ultrasonic echolocation, i.e., that transmits ultrasound through an ultrasonic speaker and converts the reflected ultrasound into audible sound, has therefore been attracting interest. Such a system can be utilized with machine learning (ML) to help visually impaired users recognize objects. We have therefore been developing a cooperative echolocation system that combines human recognition with ML recognition. As the first step toward cooperative echolocation, this paper presents the effectiveness of ML in echolocation. We implemented a prototype device and evaluated the performance of object detection with/without ML and found that the mental workload on the user was significantly decreased when ML was used. Based on the findings from the evaluation, we discussed the design of cooperative echolocation.
Main Authors: | Hiroki Watanabe, Miwa Sumiya, Tsutomu Terada |
Format: | Article |
Language: | English |
Published: | IEEE, 2022-01-01 |
Series: | IEEE Access |
Subjects: | Assistive technology; echolocation; object recognition; ultrasound; wearable computing |
Online Access: | https://ieeexplore.ieee.org/document/9963533/ |
_version_ | 1811311633881890816 |
author | Hiroki Watanabe; Miwa Sumiya; Tsutomu Terada |
author_facet | Hiroki Watanabe; Miwa Sumiya; Tsutomu Terada |
author_sort | Hiroki Watanabe |
collection | DOAJ |
description | Echolocation has been shown to improve the independence of visually impaired people, and utilizing ultrasound in echolocation offers additional advantages, such as a higher resolution of object sensing and ease of extraction from background sounds. However, humans cannot innately make and hear ultrasound. A wearable device that enables ultrasonic echolocation, i.e., that transmits ultrasound through an ultrasonic speaker and converts the reflected ultrasound into audible sound, has therefore been attracting interest. Such a system can be utilized with machine learning (ML) to help visually impaired users recognize objects. We have therefore been developing a cooperative echolocation system that combines human recognition with ML recognition. As the first step toward cooperative echolocation, this paper presents the effectiveness of ML in echolocation. We implemented a prototype device and evaluated the performance of object detection with/without ML and found that the mental workload on the user was significantly decreased when ML was used. Based on the findings from the evaluation, we discussed the design of cooperative echolocation. |
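The description above notes that the device "converts the reflected ultrasound into audible sound." A common way to do this in general (not necessarily the method used in the paper) is heterodyne down-conversion: the received echo is multiplied by a local-oscillator tone so that the difference frequency falls in the audible band, then low-pass filtered. The sketch below illustrates this on a simulated 40 kHz echo; all frequencies and filter parameters are assumptions chosen for the demo.

```python
# Illustrative heterodyne down-conversion sketch (assumed parameters,
# not the paper's implementation): shift a 40 kHz echo to 2 kHz.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 192_000       # sample rate high enough to represent ultrasound
f_echo = 40_000    # simulated ultrasonic echo frequency (assumption)
f_lo = 38_000      # local-oscillator frequency; difference lands at 2 kHz
t = np.arange(0, 0.1, 1 / fs)

echo = np.sin(2 * np.pi * f_echo * t)        # stand-in for a received echo
mixed = echo * np.cos(2 * np.pi * f_lo * t)  # mixing yields 2 kHz and 78 kHz components

# Low-pass filter keeps only the audible difference frequency.
b, a = butter(4, 5_000 / (fs / 2))
audible = filtfilt(b, a, mixed)

# The dominant frequency of the result is |f_echo - f_lo| = 2 kHz.
peak = np.argmax(np.abs(np.fft.rfft(audible)))
print(round(np.fft.rfftfreq(len(audible), 1 / fs)[peak]))  # → 2000
```

A real device would apply the same mixing to microphone input in real time; the design choice of `f_lo` sets where echoes land in the listener's hearing range.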
first_indexed | 2024-04-13T10:22:14Z |
format | Article |
id | doaj.art-c5c911d686674937a7da20041156ec51 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-04-13T10:22:14Z |
publishDate | 2022-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-c5c911d686674937a7da20041156ec51 | 2022-12-22T02:50:28Z | eng | IEEE | IEEE Access | ISSN 2169-3536 | 2022-01-01 | vol. 10, pp. 125264-125278 | DOI 10.1109/ACCESS.2022.3224468 | document 9963533 | Human-Machine Cooperative Echolocation Using Ultrasound | Hiroki Watanabe (https://orcid.org/0000-0002-6854-4448), Graduate School of Information Science and Technology, Hokkaido University, Sapporo, Japan; Miwa Sumiya (https://orcid.org/0000-0002-8505-6294), Japan Society for the Promotion of Science, Tokyo, Japan; Tsutomu Terada (https://orcid.org/0000-0003-2260-3788), Graduate School of Engineering, Kobe University, Kobe, Japan | Echolocation has been shown to improve the independence of visually impaired people, and utilizing ultrasound in echolocation offers additional advantages, such as a higher resolution of object sensing and ease of extraction from background sounds. However, humans cannot innately make and hear ultrasound. A wearable device that enables ultrasonic echolocation, i.e., that transmits ultrasound through an ultrasonic speaker and converts the reflected ultrasound into audible sound, has therefore been attracting interest. Such a system can be utilized with machine learning (ML) to help visually impaired users recognize objects. We have therefore been developing a cooperative echolocation system that combines human recognition with ML recognition. As the first step toward cooperative echolocation, this paper presents the effectiveness of ML in echolocation. We implemented a prototype device and evaluated the performance of object detection with/without ML and found that the mental workload on the user was significantly decreased when ML was used. Based on the findings from the evaluation, we discussed the design of cooperative echolocation. | https://ieeexplore.ieee.org/document/9963533/ | Assistive technology; echolocation; object recognition; ultrasound; wearable computing |
spellingShingle | Hiroki Watanabe; Miwa Sumiya; Tsutomu Terada; Human-Machine Cooperative Echolocation Using Ultrasound; IEEE Access; Assistive technology; echolocation; object recognition; ultrasound; wearable computing |
title | Human-Machine Cooperative Echolocation Using Ultrasound |
title_full | Human-Machine Cooperative Echolocation Using Ultrasound |
title_fullStr | Human-Machine Cooperative Echolocation Using Ultrasound |
title_full_unstemmed | Human-Machine Cooperative Echolocation Using Ultrasound |
title_short | Human-Machine Cooperative Echolocation Using Ultrasound |
title_sort | human machine cooperative echolocation using ultrasound |
topic | Assistive technology; echolocation; object recognition; ultrasound; wearable computing |
url | https://ieeexplore.ieee.org/document/9963533/ |
work_keys_str_mv | AT hirokiwatanabe humanmachinecooperativeecholocationusingultrasound AT miwasumiya humanmachinecooperativeecholocationusingultrasound AT tsutomuterada humanmachinecooperativeecholocationusingultrasound |