Face2Gesture: Translating Facial Expressions Into Robot Movements Through Shared Latent Space Neural Networks
In this work, we present a method for personalizing human-robot interaction by using emotive facial expressions to generate affective robot movements. Movement is an important medium for robots to communicate affective states, but the expertise and time required to craft new robot movements promotes...
Main Authors: Suguitan, Michael; DePalma, Nicholas; Hoffman, Guy; Hodgins, Jessica
Other Authors: Massachusetts Institute of Technology. Media Laboratory
Format: Article
Language: English
Published: ACM, 2023
Online Access: https://hdl.handle.net/1721.1/152915
Similar Items
- Transcribing Facial Gestures
  by: Carolin Dix
  Published: (2024-02-01)
- Facial Gestures in Social Interaction
  by: Alexandra Groß, et al.
  Published: (2024-02-01)
- Bidirectional gaze guiding and indexing in human-robot interaction through a situated architecture
  by: DePalma, Nicholas Brian
  Published: (2017)
- Hybrid Target Selections by "Hand Gestures + Facial Expression" for a Rehabilitation Robot
  by: Yi Han, et al.
  Published: (2022-12-01)
- Shared silence: writing gestures and presence gestures
  by: Luiza Crosman
  Published: (2015-12-01)