Emotional Interaction with a Robot Using Facial Expressions, Face Pose and Hand Gestures


Bibliographic Details
Main Authors: Myung-Ho Ju, Hang-Bong Kang
Format: Article
Language: English
Published: SAGE Publishing 2012-09-01
Series:International Journal of Advanced Robotic Systems
Online Access: https://doi.org/10.5772/51615
collection DOAJ
description Facial expression is one of the major cues for emotional communication between humans and robots. In this paper, we present emotional human-robot interaction techniques using facial expressions combined with other useful cues, such as face pose and hand gesture. For the efficient recognition of facial expressions, it is important to locate the facial feature points. To do this, our technique estimates the 3D position of each feature point by constructing a 3D face model fitted to the user. To construct the 3D face model, we first build an Active Appearance Model (AAM) covering variations of facial expression. Next, we estimate depth information at each feature point from frontal- and side-view images. By combining the estimated depth information with the AAM, the 3D face model is fitted to the user under the various 3D transformations of each feature point. Self-occlusions due to 3D pose variation are also handled by a region weighting function on the normalized face at each frame. The recognized facial expressions, such as happiness, sadness, fear and anger, are used to change the colours of foreground and background objects in the robot's displays, as well as other robot responses. The proposed method produced desirable results for viewing comics with entertainment robots in our experiments.
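The depth-combination and self-occlusion steps in the description above can be sketched roughly as follows. This is an illustrative toy, not the authors' formulation: the function names, the sample points, and the centroid-based normal approximation used for the visibility weights are all assumptions introduced here.

```python
import numpy as np

# Toy feature points: (x, y) from a frontal image; a side image supplies depth
# as its horizontal coordinate, giving (z, y) for the same points.
frontal_xy = np.array([[-30.0, 10.0], [30.0, 10.0], [0.0, -20.0]])  # e.g. eyes, nose tip
profile_zy = np.array([[5.0, 10.0], [5.0, 10.0], [12.0, -20.0]])    # (z, y) per point

def lift_to_3d(frontal_xy, profile_zy):
    """Merge frontal (x, y) with side-view depth (z) into rough 3D points."""
    return np.column_stack([frontal_xy, profile_zy[:, 0]])

def yaw(theta):
    """Rotation about the vertical axis, modelling a head turn."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def visibility_weights(pts, theta):
    """Crude per-point stand-in for a region weighting function: approximate
    each point's outward normal by its direction from the face centroid in
    the x-z plane, rotate it with the head pose, and keep only the component
    facing the camera (+z), so points turned away get weight 0."""
    normals = pts - pts.mean(axis=0)
    normals[:, 1] = 0.0                        # ignore the vertical component
    norms = np.linalg.norm(normals, axis=1)
    norms[norms == 0] = 1.0
    normals /= norms[:, None]
    rotated = normals @ yaw(theta).T
    return np.clip(rotated[:, 2], 0.0, None)   # cosine to camera, floored at 0
```

For example, after `pts = lift_to_3d(frontal_xy, profile_zy)`, calling `visibility_weights(pts, np.pi / 4)` drives the weight of the point on the side turning away from the camera to zero, mimicking how self-occluded regions are suppressed when fitting the normalized face.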
id doaj.art-c93acb13a85548efb890c269a4fd23ef
institution Directory Open Access Journal
issn 1729-8814
volume 9
author_affiliations Myung-Ho Ju: Dept. of Computer Eng., Catholic University of Korea, Wonmi-Gu Buchon City, Gyonggi-Do, Korea; Hang-Bong Kang: Dept. of Media Eng., Catholic University of Korea, Wonmi-Gu Buchon City, Gyonggi-Do, Korea
title Emotional Interaction with a Robot Using Facial Expressions, Face Pose and Hand Gestures
url https://doi.org/10.5772/51615