Modeling the Dynamics of Nonverbal Behavior on Interpersonal Trust for Human-Robot Interactions
Format: Article
Language: en_US
Published: Association for the Advancement of Artificial Intelligence, 2014
Online Access: http://hdl.handle.net/1721.1/92378
https://orcid.org/0000-0002-0587-2065
https://orcid.org/0000-0003-1175-437X
Summary: We describe research towards creating a computational model for recognizing interpersonal trust in social interactions. We found that four negative gestural cues—leaning-backward, face-touching, hand-touching, and crossing-arms—are together predictive of lower levels of trust. Three positive gestural cues—leaning-forward, having arms-in-lap, and open-arms—are predictive of higher levels of trust. We train a probabilistic graphical model using natural social interaction data, a "Trust Hidden Markov Model" that incorporates the occurrence of these seven important gestures throughout the social interaction. This Trust HMM predicts with 69.44% accuracy whether an individual is willing to behave cooperatively or uncooperatively with their novel partner; in comparison, a gesture-ignorant model achieves 63.89% accuracy. We attempt to automate this recognition process by detecting those trust-related behaviors through 3D motion capture technology and gesture recognition algorithms. We aim to eventually create a hierarchical system—with low-level gesture recognition for high-level trust recognition—that is capable of predicting whether an individual finds another to be a trustworthy or untrustworthy partner through their nonverbal expressions.
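To make the "Trust HMM" idea concrete, here is a minimal sketch of how a discrete-observation HMM over detected gesture events could classify an interaction as trusting or distrusting. The gesture vocabulary follows the abstract; everything else—the two-state structure, the hand-set parameter values, and the two-model likelihood-comparison setup—is an illustrative assumption, not the paper's actual trained model.

```python
# Illustrative sketch: two discrete-observation HMMs (one for trusting, one for
# distrusting interactions) score a sequence of detected gestures; the
# higher-likelihood model gives the prediction. All parameters are assumed
# values for demonstration, not the paper's learned model.
import numpy as np
from scipy.special import logsumexp

# Observation vocabulary: the seven trust-related gestures from the abstract,
# plus a "none" symbol for time steps with no detected gesture.
GESTURES = ["lean_backward", "face_touch", "hand_touch", "cross_arms",  # negative cues
            "lean_forward", "arms_in_lap", "open_arms", "none"]         # positive cues + null
IDX = {g: i for i, g in enumerate(GESTURES)}

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of an observation sequence under a discrete HMM,
    computed with the standard forward algorithm in log space.
    pi: (S,) initial state probs; A: (S, S) transitions; B: (S, V) emissions."""
    log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
    alpha = log_pi + log_B[:, obs[0]]
    for o in obs[1:]:
        # alpha_j = logsum_i( alpha_i + log A[i, j] ) + log B[j, o]
        alpha = logsumexp(alpha[:, None] + log_A, axis=0) + log_B[:, o]
    return logsumexp(alpha)

# Two hidden states per model ("guarded" vs. "at ease") with hand-set,
# illustrative parameters; in practice these would be learned from labeled
# interaction data (e.g., via Baum-Welch).
pi = np.array([0.5, 0.5])
A = np.array([[0.8, 0.2],
              [0.2, 0.8]])
B_trust = np.array([  # emissions skewed toward the positive cues
    [0.05, 0.05, 0.05, 0.05, 0.10, 0.10, 0.10, 0.50],  # guarded
    [0.02, 0.02, 0.02, 0.02, 0.15, 0.15, 0.15, 0.47],  # at ease
])
B_distrust = np.array([  # emissions skewed toward the negative cues
    [0.15, 0.15, 0.15, 0.15, 0.02, 0.02, 0.02, 0.34],
    [0.10, 0.10, 0.10, 0.10, 0.05, 0.05, 0.05, 0.45],
])

def predict_trusting(gesture_sequence):
    """Return True if the trusting model explains the gesture sequence better."""
    obs = [IDX[g] for g in gesture_sequence]
    return forward_loglik(obs, pi, A, B_trust) > forward_loglik(obs, pi, A, B_distrust)

print(predict_trusting(["lean_forward", "none", "open_arms", "arms_in_lap"]))   # True
print(predict_trusting(["cross_arms", "face_touch", "none", "lean_backward"]))  # False
```

In a full pipeline of the kind the abstract envisions, the gesture labels fed to `predict_trusting` would come from the low-level recognition layer (3D motion capture plus gesture classifiers) rather than being supplied by hand.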