Estimation of Confidence in the Dialogue based on Eye Gaze and Head Movement Information

Bibliographic Details
Main Authors: Cui Dewen, Matsufuji Akihiro, Liu Yi, Eri Sato-Shimokawa, Toru Yamaguchi
Format: Article
Language: English
Published: Politeknik Elektronika Negeri Surabaya 2022-12-01
Series: Emitter: International Journal of Engineering Technology
Online Access: http://emitter.pens.ac.id/index.php/emitter/article/view/756
Description
Summary: In human-robot interaction, human mental states in dialogue have attracted attention for human-friendly robots that support educational use. Although mental-state estimation using speech and visual information has been studied, it remains challenging to estimate mental states precisely in educational settings. In this paper, we propose a method to estimate a human mental state from participants’ eye gaze and head movement information. As the target mental state, we estimate participants’ confidence levels in their answers to miscellaneous knowledge questions. Participants’ non-verbal information, such as eye gaze and head movements during dialogue with a robot, was collected in our experiment using an eye-tracking device. We then collected participants’ confidence levels and analyzed the relationship between this mental state and the non-verbal information. Furthermore, we applied a machine learning technique to estimate participants’ confidence levels from features extracted from the gaze and head movement information. As a result, the machine learning technique using gaze and head movement information achieved over 80% accuracy in estimating confidence levels. Our research provides insight into developing human-friendly robots that consider human mental states in dialogue.
ISSN: 2355-391X, 2443-1168
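
The summary describes the approach only at a high level: gaze and head-movement signals from an eye tracker are summarized into features, and a machine learning classifier estimates the participant's confidence level. The sketch below is a minimal illustration of that kind of pipeline, not the paper's implementation; the feature definitions (gaze dispersion, scan-path length, head-rotation variability), the SVM classifier, and the synthetic data are all assumptions.

    # Illustrative sketch only: the feature definitions, the SVM classifier, and
    # the synthetic data are assumptions, since the record does not give the
    # paper's exact features or model.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def extract_features(gaze_xy, head_ypr):
        """Summarize one answer segment.

        gaze_xy : (T, 2) gaze coordinates from the eye tracker.
        head_ypr: (T, 3) head yaw/pitch/roll angles.
        """
        gaze_disp = gaze_xy.std(axis=0)                           # gaze dispersion in x and y
        gaze_path = np.abs(np.diff(gaze_xy, axis=0)).sum(axis=0)  # scan-path length
        head_var = head_ypr.var(axis=0)                           # head-rotation variability
        head_speed = np.abs(np.diff(head_ypr, axis=0)).mean(axis=0)
        return np.concatenate([gaze_disp, gaze_path, head_var, head_speed])

    # Hypothetical data: one feature vector per question, with a binary
    # confident / not-confident label reported by the participant.
    rng = np.random.default_rng(0)
    X = np.stack([
        extract_features(rng.normal(size=(120, 2)), rng.normal(size=(120, 3)))
        for _ in range(60)
    ])
    y = rng.integers(0, 2, size=60)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

With real recordings, X would be built from the gaze and head-movement segments aligned to each question, and y from the confidence levels the participants reported.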