Face2Gesture: Translating Facial Expressions Into Robot Movements Through Shared Latent Space Neural Networks
In this work, we present a method for personalizing human-robot interaction by using emotive facial expressions to generate affective robot movements. Movement is an important medium for robots to communicate affective states, but the expertise and time required to craft new robot movements promotes...
Main Authors: Suguitan, Michael; DePalma, Nicholas; Hoffman, Guy; Hodgins, Jessica
Other Authors: Massachusetts Institute of Technology. Media Laboratory
Format: Article
Language: English
Published: ACM, 2023
Online Access: https://hdl.handle.net/1721.1/152915
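The abstract describes translating facial-expression features into robot movements through a shared latent space. As a rough illustration of that general architecture (not the authors' implementation), the sketch below pairs two modality-specific autoencoders over a common latent space, assuming PyTorch; all names, layer sizes, and feature dimensions are hypothetical.

```python
# Conceptual sketch of a shared-latent-space translation model (PyTorch).
# Every dimension and name here is an illustrative assumption.
import torch
import torch.nn as nn

FACE_DIM = 64    # hypothetical facial-expression feature size
MOVE_DIM = 32    # hypothetical robot-movement parameter size
LATENT_DIM = 16  # hypothetical shared latent dimension

def mlp(in_dim, out_dim):
    return nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, out_dim))

class SharedLatentTranslator(nn.Module):
    """Two modality-specific autoencoders joined by a common latent space:
    a face sample is encoded into the shared space and decoded as movement."""
    def __init__(self):
        super().__init__()
        self.face_enc = mlp(FACE_DIM, LATENT_DIM)
        self.face_dec = mlp(LATENT_DIM, FACE_DIM)
        self.move_enc = mlp(MOVE_DIM, LATENT_DIM)
        self.move_dec = mlp(LATENT_DIM, MOVE_DIM)

    def translate(self, face):
        # Cross-modal path: face features -> shared latent -> movement parameters.
        return self.move_dec(self.face_enc(face))

    def loss(self, face, move):
        # Within-modality reconstruction plus an alignment term that pulls
        # paired face/movement samples toward the same latent point.
        z_f, z_m = self.face_enc(face), self.move_enc(move)
        rec = nn.functional.mse_loss(self.face_dec(z_f), face) \
            + nn.functional.mse_loss(self.move_dec(z_m), move)
        align = nn.functional.mse_loss(z_f, z_m)
        return rec + align

model = SharedLatentTranslator()
movement = model.translate(torch.randn(1, FACE_DIM))  # one translated sample
```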
Similar Items
- Toward a One-interaction Data-driven Guide: Putting co-Speech Gesture Evidence to Work for Ambiguous Route Instructions
  by: DePalma, Nicholas, et al.
  Published: (2025)
- Bidirectional gaze guiding and indexing in human-robot interaction through a situated architecture
  by: DePalma, Nicholas Brian
  Published: (2017)
- Latent-Dynamic Discriminative Models for Continuous Gesture Recognition
  by: Morency, Louis-Philippe, et al.
  Published: (2007)
- Crowd-Sourcing Real-World Human-Robot Dialogue and Teamwork through Online Multiplayer Games
  by: Chernova, Sonia, et al.
  Published: (2017)
- Gesture learning in social robots
  by: Shen, Jiayu
  Published: (2011)