Face2Gesture: Translating Facial Expressions Into Robot Movements Through Shared Latent Space Neural Networks
In this work, we present a method for personalizing human-robot interaction by using emotive facial expressions to generate affective robot movements. Movement is an important medium for robots to communicate affective states, but the expertise and time required to craft new robot movements promotes...
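The record carries no implementation detail beyond the title's naming of a shared latent space connecting the two modalities. As a minimal sketch of that general pattern (not the authors' architecture), the PyTorch snippet below trains two autoencoders, one for facial-expression features and one for robot movements, while pulling their latent codes together on paired data so that an expression embedding can be decoded as a movement. All dimensions, layer sizes, and loss terms here are illustrative assumptions.

```python
import torch
import torch.nn as nn

def mlp(in_dim, out_dim, hidden=128):
    # Small two-layer MLP used for every encoder/decoder in this sketch.
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, out_dim))

# Hypothetical dimensions (assumptions, not from the paper): 68 2-D facial
# landmarks flattened to 136 values; a short robot joint trajectory
# flattened to 60 values; a 16-D shared latent space.
FACE_DIM, MOVE_DIM, LATENT_DIM = 136, 60, 16

face_enc, face_dec = mlp(FACE_DIM, LATENT_DIM), mlp(LATENT_DIM, FACE_DIM)
move_enc, move_dec = mlp(MOVE_DIM, LATENT_DIM), mlp(LATENT_DIM, MOVE_DIM)

params = [p for m in (face_enc, face_dec, move_enc, move_dec)
          for p in m.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)
mse = nn.MSELoss()

# Toy paired batch standing in for expression/movement pairs that share
# an affect label.
faces = torch.randn(32, FACE_DIM)
moves = torch.randn(32, MOVE_DIM)

z_face, z_move = face_enc(faces), move_enc(moves)

# Two reconstruction terms keep each autoencoder faithful to its own
# domain; the alignment term pulls paired embeddings together so that,
# at inference time, a face embedding can be decoded by the movement
# decoder.
loss = (mse(face_dec(z_face), faces)
        + mse(move_dec(z_move), moves)
        + mse(z_face, z_move))
opt.zero_grad()
loss.backward()
opt.step()

# Cross-domain translation: encode an expression, decode a movement.
with torch.no_grad():
    generated_movement = move_dec(face_enc(faces[:1]))
```

The plain alignment loss is only the simplest way to tie two embeddings into one latent space; contrastive or adversarial objectives are common alternatives for the same goal.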
| Main Authors: | Suguitan, Michael; DePalma, Nicholas; Hoffman, Guy; Hodgins, Jessica |
| --- | --- |
| Other Authors: | Massachusetts Institute of Technology. Media Laboratory |
| Format: | Article |
| Language: | English |
| Published: | ACM, 2023 |
| Online Access: | https://hdl.handle.net/1721.1/152915 |
Similar Items

- Toward a One-interaction Data-driven Guide: Putting co-Speech Gesture Evidence to Work for Ambiguous Route Instructions
  by: DePalma, Nicholas, et al. Published: (2025)
- Acume: A New Visualization Tool for Understanding Facial Expression and Gesture Data
  by: McDuff, Daniel Jonathan, et al. Published: (2011)
- Real-Time Inference of Mental States from Facial Expressions and Upper Body Gestures
  by: Baltrusaitis, Tadas, et al. Published: (2011)
- FaceFacts: study of facial features for understanding expression
  by: Choudhury, Tanzeem Khalid, 1975- Published: (2011)
- Latent-Dynamic Discriminative Models for Continuous Gesture Recognition
  by: Morency, Louis-Philippe, et al. Published: (2007)