Body Gesture and Head Movement Analyses in Dyadic Parent-Child Interaction as Indicators of Relationship
| Main Authors: | Alghowinem, Sharifa; Chen, Huili; Breazeal, Cynthia; Park, Hae Won |
|---|---|
| Other Authors: | Program in Media Arts and Sciences (Massachusetts Institute of Technology) |
| Format: | Article |
| Language: | English |
| Published: | Institute of Electrical and Electronics Engineers (IEEE), 2023 |
| Online Access: | https://hdl.handle.net/1721.1/147116 |
Similar Items

- Dyadic Affect in Parent-child Multi-modal Interaction: Introducing the DAMI-P2C Dataset and its Preliminary Analysis
  by: Chen, Huili, et al.
  Published: (2022)
- Integrating Flow Theory and Adaptive Robot Roles: A Conceptual Model of Dynamic Robot Role Adaptation for the Enhanced Flow Experience in Long-term Multi-person Human-Robot Interactions
  by: Chen, Huili, et al.
  Published: (2024)
- Impact of Interaction Context on the Student Affect-Learning Relationship in Child-Robot Interaction
  by: Chen, Huili, et al.
  Published: (2021)
- Multimodal region-based behavioral modeling for suicide risk screening
  by: Alghowinem, Sharifa, et al.
  Published: (2023)
- Dyadic Speech-based Affect Recognition using DAMI-P2C Parent-child Multimodal Interaction Dataset
  by: Chen, H., et al.
  Published: (2021)