Hug behavior response model for generation of hug behavior with humans


Bibliographic Details
Main Authors: Mitsuru JINDAI, Shunsuke OTA, Toshiyuki YASUDA, Tohru SASAKI
Format: Article
Language: English
Published: The Japan Society of Mechanical Engineers 2018-04-01
Series: Journal of Advanced Mechanical Design, Systems, and Manufacturing
Online Access: https://www.jstage.jst.go.jp/article/jamdsm/12/2/12_2018jamdsm0035/_pdf/-char/en
Description
Summary: In human face-to-face communication, embodied sharing through the synchronization of embodied rhythms is promoted by embodied interactions. Such interactions are therefore critical for smoothly initiating coexistence and communication. In particular, hug behavior, a type of embodied interaction in which humans make whole-body contact with each other, can effectively promote the synchronization of embodied rhythms. In the case of a human and a robot, it is likely that the robot could effectively synchronize its embodied rhythm with a human's by using hug behavior. In the authors' previous study, a behavior model for the generation of hug behavior with humans was proposed, and the timing of a human's voice greeting was found to be critical for the generation of smooth arm motion by a robot. However, that model generates only the arm motions for hugs; it cannot generate the overall flow of hug behavior. Therefore, this study proposes a hug behavior response model that generates response behaviors when a human requests a hug while approaching the robot. Furthermore, a hug robot system that uses the proposed model is developed. The effectiveness of the proposed hug behavior response model is demonstrated by sensory evaluations using this robot system.
ISSN:1881-3054