Hands-Free User Interface for AR/VR Devices Exploiting Wearer’s Facial Gestures Using Unsupervised Deep Learning

Developing a user interface (UI) suitable for headset environments is one of the challenges in the field of augmented reality (AR) technologies. This study proposes a hands-free UI for an AR headset that exploits the wearer's facial gestures to recognize user intentions. The facial gestures of the headset wearer are detected by a custom-designed sensor that measures skin deformation based on the infrared diffusion characteristics of human skin. We designed a deep neural network classifier to determine the user's intended gestures from the skin-deformation data, which are exploited as user input commands for the proposed UI system. The proposed classifier is composed of a spatiotemporal autoencoder and a deep embedded clustering algorithm, trained in an unsupervised manner. The UI device was embedded in a commercial AR headset, and several experiments were performed on online sensor data to verify the operation of the device. We implemented a hands-free UI for an AR headset that recognized user commands with an average accuracy of 95.4% in tests with participants.
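The abstract names the two learning components, a spatiotemporal autoencoder and deep embedded clustering (DEC), but the record carries no implementation details. The sketch below is a minimal, hypothetical PyTorch rendering of that kind of pipeline: an autoencoder compresses windows of multi-channel skin-deformation readings into latent codes, and a DEC head (Xie et al., 2016) groups those codes into gesture clusters without labels. The channel count, window length, number of clusters, and layer sizes are all assumptions for illustration, not the authors' configuration.

```python
# Hypothetical sketch: spatiotemporal autoencoder + deep embedded
# clustering (DEC) for unlabeled skin-deformation windows.
# All sizes (8 sensor channels, 50-sample windows, 5 gesture clusters,
# 16-dim latent space) are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatiotemporalAE(nn.Module):
    """Conv1d mixes the sensor channels while sliding over time (local
    spatiotemporal patterns); the LSTM models longer-range dynamics."""
    def __init__(self, channels=8, latent_dim=16):
        super().__init__()
        self.conv = nn.Conv1d(channels, 32, kernel_size=3, padding=1)
        self.enc_lstm = nn.LSTM(32, latent_dim, batch_first=True)
        self.dec_lstm = nn.LSTM(latent_dim, 32, batch_first=True)
        self.deconv = nn.Conv1d(32, channels, kernel_size=3, padding=1)

    def forward(self, x):                       # x: (batch, time, channels)
        h = F.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        z_seq, _ = self.enc_lstm(h)
        z = z_seq[:, -1]                        # latent code = last time step
        rep = z.unsqueeze(1).repeat(1, x.size(1), 1)
        d, _ = self.dec_lstm(rep)
        recon = self.deconv(d.transpose(1, 2)).transpose(1, 2)
        return recon, z

def dec_soft_assign(z, centroids, alpha=1.0):
    """Student's-t soft assignment q_ij from the DEC paper."""
    dist2 = torch.cdist(z, centroids).pow(2)
    q = (1.0 + dist2 / alpha).pow(-(alpha + 1) / 2)
    return q / q.sum(dim=1, keepdim=True)

def dec_target(q):
    """Sharpened target distribution p_ij for the DEC KL loss."""
    w = q.pow(2) / q.sum(dim=0)
    return w / w.sum(dim=1, keepdim=True)

# One unsupervised step on a fake batch (reconstruction + clustering KL).
model = SpatiotemporalAE()
centroids = nn.Parameter(torch.randn(5, 16))    # 5 gesture clusters, assumed
opt = torch.optim.Adam(list(model.parameters()) + [centroids], lr=1e-3)
x = torch.randn(32, 50, 8)                      # stand-in sensor windows
opt.zero_grad()
recon, z = model(x)
q = dec_soft_assign(z, centroids)
loss = F.mse_loss(recon, x) + F.kl_div(q.log(), dec_target(q).detach(),
                                       reduction="batchmean")
loss.backward()
opt.step()
```

In standard DEC, the autoencoder is first pretrained on reconstruction alone and the centroids are initialized with k-means over the latent codes before the KL term is turned on; the paper itself is the authority on how its classifier is trained and how cluster assignments are mapped to UI commands.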

Bibliographic Details
Main Authors: Jaekwang Cha, Jinhyuk Kim, Shiho Kim
Affiliation: Seamless Transportation Lab (STL), School of Integrated Technology, and Yonsei Institute of Convergence Technology, Yonsei University, Incheon 21983, Korea
Format: Article
Language: English
Published: MDPI AG, 2019-10-01
Series: Sensors, Vol. 19, Issue 20, Article 4441
ISSN: 1424-8220
DOI: 10.3390/s19204441
Collection: DOAJ (Directory of Open Access Journals)
Record ID: doaj.art-51a3bf8f02524cda88b8036ade240053
Subjects: hands-free interface; augmented reality; spatiotemporal autoencoder; deep embedded clustering
Online Access: https://www.mdpi.com/1424-8220/19/20/4441